
State of HTML 2024 Suggestions #235

Open
SachaG opened this issue May 22, 2024 · 22 comments

Comments

@SachaG
Member

SachaG commented May 22, 2024

I doubt this thread will see any replies because obviously we hit it out of the park on our first try and the State of HTML 2023 survey was 100% perfect, but just in case here you go.

@SachaG
Member Author

SachaG commented May 22, 2024

@j9t

j9t commented May 22, 2024

First, what a great and insightful survey! Really useful to have this one as part of the “State of…” series of surveys.

As shared by email (where I pointed to the respective public post), I missed coverage of HTML conformance and highly recommend adding it. For example: how authors think about conformance, what tooling they use, what could help them ensure conformant/valid output, etc.

This seems relevant from several angles, the most important one being that only checking conformance ensures that the HTML is actual HTML (i.e., doesn’t contain erroneous, non-working code).

From what is known from validating many projects, most projects do contain HTML errors, so covering conformance in the survey could help identify the causes of this situation (as well as raise awareness).

@SachaG
Member Author

SachaG commented Jul 24, 2024

Some other things to improve for the 2024 edition:

"mini-features" questions

In the 2023 edition we had a distinction between "main features":

[screenshot: example of a “main feature” question]

And "mini-features" (which are questions that ask about a bunch of features at once):

[screenshot: example of a “mini-features” question]

The idea was that we could be very granular with main features while still covering a lot of ground with mini-features, without losing too much data, since we still asked for sentiment even for mini-features.

In the end though:

  1. I never had time to implement the code required to make mini-feature sentiment available through our API (or visualize it for that matter)
  2. I'm not sure it's even a good idea in the first place to collect data in two formats that are so close, yet not quite the same.

So I would be in favor of either:

  1. Only having main features (my preferred solution)
  2. Having main features + mini-features, but without collecting sentiment for the latter

Freeform questions

In the 2023 edition we had at least one freeform "pain points" question per section, and sometimes more.

[screenshot: example of a freeform “pain points” question]

This led to a lot of data to analyze, and delayed the survey results. The idea this time around is to use the data collected in 2023 to populate predefined lists for 2024 (while still giving people an "other…" option).

That being said, defining the list of predefined options can in itself be a challenge, as some questions have very concrete results (form pain points -> styling select elements) while others are much more vague (web component pain points -> excessive complexity).

General length

Overall the 2023 survey was quite long, which I think makes sense because it was the first edition and we weren't quite sure what to include or not, so we erred on the side of including everything. But going forward it would be great to remove any question that isn't strictly necessary.

Basically, to earn a spot in the survey, a question should be either interesting to web developers reading the results or useful to browser vendors – or ideally both.

@jgraham

jgraham commented Aug 1, 2024

Some initial feedback from Mozilla's point of view:

  • In future editions we should make sure to include features that have recently shipped in all browsers or recently become baseline.
    Rationale: this ensures that we as browser vendors are working on solving real user problems. If we ship a lot of features that have negative sentiment even after they're widely available, we need to rethink how we assess features earlier in the lifecycle.
  • We should avoid asking about features that are not yet on a standards track, or don't have strong indications of cross vendor support. In this edition almost half the features we asked about had been used by <10% of the respondents. Instead we should focus on asking about the use cases early stage features are intending to address.
    Rationale: Developers can't provide useful assessment of features that they haven't used or even heard of. In that situation, any opinions they give are either based on whether the one-sentence description sounds plausibly useful, or perhaps on devrel content they've come across, usually from the team working on the feature. Obviously we do want to know that features are solving real problems, even in an early prototype stage. However framing those questions in terms of use cases rather than prospective solutions is likely to give much more useful insights.
  • We should ask about existing features in areas where we know there's room for improvement. For example, we know that form controls and i18n are pain points, so we should ask for experience of existing features designed to solve problems in these areas, such as <input type=datetime> and <time>. Corollary: we should ask fewer questions about areas of the platform that we don't think are likely to undergo active development.
    Rationale: Where we believe there's demand for improvements to the platform, we should look at the existing solutions in that area to understand where they work and where they fall short. That will help us ensure that any new developments are effective in solving user needs which are unmet by the current platform features.

@SachaG
Member Author

SachaG commented Aug 1, 2024

@jgraham that all makes sense to me. Although regarding point 2, one reason to ask about features early on is to be able to track their progress over the years. For example, something might be at 2% awareness one year, then 50% the next, then 70% the next, which would indicate a really fast progression and be a sign of strong interest. If you didn't have that first 2% datapoint, you'd only have the 50% to 70% progression, which isn't as impressive.

@jgraham

jgraham commented Aug 1, 2024

There are always a lot of features in the early prototype stage, and finding out that most people haven't heard of them isn't that surprising, nor a very good use of survey takers' time. Last year we asked about 14 features that more than 75% of respondents hadn't even heard of, and 20 (with some overlap) that under 10% of survey takers had used. That seems far too high a proportion.

For things that are very early in the development process, and are likely to require substantial technical revision, general audience feedback just isn't that useful. Specification designers (or prospective implementers) would be better off working directly with small groups of target developers to understand whether their proposal is meeting the required use case (ideally from multiple places, to avoid designing features that e.g. work well for large product teams at big orgs but not outside that context).

Being able to retrospectively measure the growth of interest in the cases where we happen to pick a feature that does make it all the way through the standards process to cross-browser implementation just doesn't seem that useful compared to other, more immediately actionable, data we could be collecting with the survey.

@SachaG
Member Author

SachaG commented Aug 2, 2024

Yeah, that makes sense. Unlike the JS/CSS surveys, where I'm familiar with all the items mentioned (or at least have a vague understanding of what they do), the HTML survey covered a lot of ground that I didn't know as much about. So I can't say I feel qualified to judge this one way or the other. But I think erring on the side of making the survey shorter/simpler is probably the right thing to do.

@SachaG
Member Author

SachaG commented Aug 9, 2024

So like I said previously, I want to get rid of "mini features" (defined as many features that are grouped under a single question heading, as opposed to each feature having its own question) to avoid having two different question formats asking about similar topics.

Here is the list of all mini-features that were in the 2023 survey, grouped by question. We can of course include some of them as "main features" if we think they should be part of the survey:

form_input_types

  • input_type_range
  • input_type_number
  • input_type_file
  • input_type_date
  • input_type_time
  • input_type_datetime_locale
  • input_type_month
  • input_type_week
  • input_type_color
  • input_type_email
  • input_type_tel
  • input_type_url

form_validation_features

  • required_attribute
  • pattern_attribute
  • input_set_custom_validity
  • input_report_validity
  • input_check_validity
  • invalid_event
  • valid_invalid_pseudo_class

dom_attribute_features

  • element_classlist
  • element_toggle_attribute
  • element_getattributenames

dom_html_features

  • element_innerhtml
  • element_outerhtml
  • element_insert_adjacent_html
  • domparser

dom_moving_element_features

  • element_insert_adjacent_element
  • element_append
  • element_before
  • element_replace_with
  • element_replace_children
  • document_create_document_fragment

interactivity_techniques

  • css_for_interactivity
  • vanilla_js
  • web_components
  • js_dom_libraries
  • js_framework

external_content_elements

  • picture_element
  • video_element
  • audio_element
  • iframe_element
  • object_element
  • svg_element
  • math_element
  • canvas_element

privacy_security_features

  • crossorigin_attribute
  • integrity_attribute
  • nonce_attribute
  • referrerpolicy_attribute
  • sandbox_attribute
  • allow_attribute

rel_attribute

  • prefetch_attribute
  • dns_prefetch_attribute
  • preconnect_attribute
  • preload_attribute
  • modulepreload_attribute
  • prerender_attribute

machine_readable_features

  • time_element
  • data_element
  • microdata
  • microformats
  • rdfa
  • json_ld

i18n_features

  • dir_attribute
  • ruby_rp_rt_elements
  • intl_datetimeformat
  • intl_collator
  • intl_listformat
  • intl_numberformat
  • intl_messageformat
  • intl_pluralrules
  • intl_relative_time_format
  • intl_segmenter
  • intl_localematcher

web_components_features

  • slot_element
  • part_attribute
  • part_pseudo
  • host_pseudos
  • slotted_pseudo_element
  • custom_elements_get
  • custom_elements_when_defined
  • custom_elements_get_name

pwa_app_manifest_fields

  • app_manifest_name
  • app_manifest_icons
  • app_manifest_shortcuts # chrome-only
  • app_manifest_file_handlers # chrome-only
  • app_manifest_share_target # chrome-only
  • app_manifest_launch_handler # chrome-only

local_storage_features

  • document_cookie
  • local_storage
  • session_storage
  • indexed_db
  • cache_storage

pwa_features

  • app_manifest
  • service_workers
  • notifications_api

@SachaG
Member Author

SachaG commented Aug 13, 2024

Preview version: #246

@j9t

j9t commented Aug 13, 2024

As shared by email (where I pointed to the respective public post), I missed coverage of HTML conformance and highly recommend adding it. For example: how authors think about conformance, what tooling they use, what could help them ensure conformant/valid output, etc.

This seems relevant from several angles, the most important one being that only checking conformance ensures that the HTML is actual HTML (i.e., doesn’t contain erroneous, non-working code).

I didn’t spot anything in the preview, but could the survey cover how people confirm they’re using HTML in the first place?

We know this is a problem (e.g., last year, 0 of the most popular sites used valid HTML code), and asking about conformance and validation could both help us understand this better and nudge people in the direction of using actual HTML code (which comes with the benefit of not wasting payload on code that doesn’t work).

I don’t think this needs a big section, just 2–3 questions. Happy to propose something.

@SachaG
Member Author

SachaG commented Aug 24, 2024

@j9t I would love to have some suggestions on how to ask about conformance! If nothing else we could list some relevant tools in the "other tools" section.

@j9t

j9t commented Aug 26, 2024

@j9t I would love to have some suggestions on how to ask about conformance! If nothing else we could list some relevant tools in the "other tools" section.

Here are some quick thoughts:

  • What tool(s) do you use for HTML validation and conformance checking?—e.g., the W3C validator or validator.nu (both use the same checker for living HTML), packages like html-validate, other tools
  • If you’re not checking on HTML conformance, why not?—e.g., lack of tooling, lack of priority, lack of awareness, influenced by HTML error tolerance
  • What would motivate you most to prioritize HTML conformance?—e.g., better/more tools, peers doing the same, employers requiring it, more emphasis during training

Preset multiple-choice answers could be provided (fine-tuning and extending the examples above), but the questions could be open-ended as well.

If you have more preferences and constraints on your end, I’m happy to help tweak this further!
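If tooling questions like these make it in, it may help to keep in mind what programmatic conformance checking looks like in practice. A minimal sketch, assuming the JSON output format of the Nu HTML Checker (the engine behind both validator.nu and the W3C checker when requested with `out=json`): the response carries a `messages` array whose entries have a `type` field. The sample response below is illustrative, not a real checker run.

```python
import json

def count_errors(report_json: str) -> int:
    """Count error-level messages in a Nu HTML Checker JSON report.

    The checker returns a JSON object with a "messages" array;
    each message carries a "type" field ("error", "info", ...).
    """
    report = json.loads(report_json)
    return sum(1 for m in report.get("messages", []) if m.get("type") == "error")

# Illustrative response in the checker's documented shape:
sample = json.dumps({
    "messages": [
        {"type": "error", "message": "Element div not allowed as child of element span."},
        {"type": "info", "subType": "warning", "message": "Consider adding a lang attribute."},
    ]
})
print(count_errors(sample))  # → 1
```

A survey-side takeaway from this shape: “do you validate” can be asked separately from “do you gate CI on errors vs. warnings”, since the checker itself distinguishes the two.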

@b1tr0t

b1tr0t commented Aug 30, 2024

Some initial feedback from Mozilla's point of view:

  • In future editions we should make sure to include features that have recently shipped in all browsers or recently become baseline.
    Rationale: this ensures that we as browser vendors are working on solving real user problems. If we ship a lot of features that have negative sentiment even after they're widely available, we need to rethink how we assess features earlier in the lifecycle.
  • We should avoid asking about features that are not yet on a standards track, or don't have strong indications of cross vendor support. In this edition almost half the features we asked about had been used by <10% of the respondents. Instead we should focus on asking about the use cases early stage features are intending to address.
    Rationale: Developers can't provide useful assessment of features that they haven't used or even heard of. In that situation, any opinions they give are either based on whether the one-sentence description sounds plausibly useful, or perhaps on devrel content they've come across, usually from the team working on the feature. Obviously we do want to know that features are solving real problems, even in an early prototype stage. However framing those questions in terms of use cases rather than prospective solutions is likely to give much more useful insights.
  • We should ask about existing features in areas where we know there's room for improvement. For example, we know that form controls and i18n are pain points, so we should ask for experience of existing features designed to solve problems in these areas, such as <input type=datetime> and <time>. Corollary: we should ask fewer questions about areas of the platform that we don't think are likely to undergo active development.
    Rationale: Where we believe there's demand for improvements to the platform, we should look at the existing solutions in that area to understand where they work and where they fall short. That will help us ensure that any new developments are effective in solving user needs which are unmet by the current platform features.

I agree with most of these points!

A few other things I'd like more information about:

  • I'm hearing anecdotally that developers are being asked by businesses to place a higher priority on accessibility than in the past. It would be good to have a question asking whether developers expect to place more, equal, or reduced emphasis on accessibility going forward.
  • Similar to j9t's questions about tools for validation / performance testing, what tools do developers rely on for accessibility testing?
  • Would it be possible to have more freeform information about the features developers are looking for? For example, Data Table and Tabs elements are flagged as missing, but I'd love to better understand what features developers are really looking for, use cases, etc. I'd happily parse raw feedback even if it's not visualized.

@b1tr0t

b1tr0t commented Sep 3, 2024

I'd also love more feedback on whether with newer features developers feel like it is easier (or harder!) to achieve accessible designs. Perhaps through citing some examples of recent baseline features, such as popover?

@SachaG
Member Author

SachaG commented Sep 5, 2024

It would be good to have a question that asked whether developers expect they will have more, an equal or reduced emphasis on accessibility going forward.

I'd also love more feedback on whether with newer features developers feel like it is easier (or harder!) to achieve accessible designs.

We could have opinion questions, which we used to have in some surveys. This would be something of the form:

I am being asked to place a higher priority on accessibility by businesses than in the past

- Strongly disagree
- Disagree
- Neutral
- Agree
- Strongly agree
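An opinion question like this aggregates naturally into a single net score. A minimal sketch (the response tallies below are hypothetical), mapping the five Likert labels onto a −2…+2 scale and averaging, so a positive result means net agreement:

```python
# Likert labels mapped to a symmetric -2..+2 scale.
LIKERT = {
    "Strongly disagree": -2,
    "Disagree": -1,
    "Neutral": 0,
    "Agree": 1,
    "Strongly agree": 2,
}

def net_sentiment(counts: dict[str, int]) -> float:
    """Mean score across all responses; positive means net agreement."""
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return sum(LIKERT[label] * n for label, n in counts.items()) / total

# Hypothetical tallies for the accessibility-priority statement above:
print(net_sentiment({"Strongly disagree": 5, "Disagree": 10, "Neutral": 20,
                     "Agree": 40, "Strongly agree": 25}))  # → 0.7
```

One nice property of this format is that the same score can be tracked year over year, which fits the progression-tracking point discussed earlier in the thread.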

@atopal

atopal commented Sep 5, 2024

Can we change the "" entry to "Customizable " for this year? That is the new name for it: https://open-ui.org/components/customizableselect (maybe we keep in parentheses?)

@josepharhar

Can we change the "" entry to "Customizable " for this year? That is the new name for it: https://open-ui.org/components/customizableselect (maybe we keep in parentheses?)

I think this meant to say:

Can we change the "selectlist" entry to "Customizable select" for this year? That is the new name for it: https://open-ui.org/components/customizableselect (maybe we keep in parentheses?)

@atopal

atopal commented Sep 6, 2024

Yes, thanks for catching that! I forgot to escape the angle brackets.

@SachaG
Member Author

SachaG commented Sep 10, 2024

So just so I'm clear, this went from <selectmenu>, to <selectlist>, to now just <select> but with this CSS snippet required?

select,
::picker(select) {
  appearance: base-select;
}

@josepharhar

So just so I'm clear, this went from <selectmenu>, to <selectlist>, to now just <select> but with this CSS snippet required?

select,
::picker(select) {
  appearance: base-select;
}

Yes

@webdevUXR

The "State of" surveys are such a fundamental tool for gauging web developer needs that I wonder if we could include a question about developers' overall satisfaction with the web platform, and one on the broader challenges web devs face. For example:

For developer pain points, we could ask:

Here are some experiences that developers may find “painful” or frustrating. Rank them in order from most painful to least painful, using “1” for most painful and “8” for least painful:

  • Keep up with new features on the web platform
  • Deal with framework updates
  • Keep up with a large number of frameworks
  • Investigate and triage a bug
  • Understand and implement security measures
  • Test end-to-end user flows across browsers
  • Make a design work the same way across browsers
  • Deal with negative effects of browser updates
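A ranking question like this aggregates naturally by mean rank (lower = more painful). A rough sketch, with abbreviated item names and invented responses:

```python
from statistics import mean

def mean_ranks(responses: list[dict[str, int]]) -> dict[str, float]:
    """Average the rank each respondent gave each item (1 = most painful)."""
    items = responses[0].keys()
    return {item: mean(r[item] for r in responses) for item in items}

# Two hypothetical respondents ranking three of the pain points above:
responses = [
    {"Framework updates": 2, "Cross-browser testing": 1, "Browser updates": 3},
    {"Framework updates": 3, "Cross-browser testing": 1, "Browser updates": 2},
]
ranked = sorted(mean_ranks(responses).items(), key=lambda kv: kv[1])
print(ranked[0][0])  # → Cross-browser testing
```

Forced ranking avoids the "everything is important" problem that plagues independent 1–5 ratings, at the cost of being harder for respondents to fill in with eight items.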

For Web Platform satisfaction, we could ask:

"How would you rate your overall satisfaction with the Web, as a platform and set of tools, to enable you to build what you need or want?"

  • Very satisfied
  • Somewhat satisfied
  • Neither satisfied nor dissatisfied
  • Somewhat dissatisfied
  • Very dissatisfied

@SachaG
Member Author

SachaG commented Sep 13, 2024

@webdevUXR we actually do have questions that are very similar to this, and should hopefully provide us with good insights!
