DOI: 10.1145/3677846.3677853 (W4A Conference Proceedings)
research-article
Open access

A Universal Web Accessibility Feedback Form: A Participatory Design Study

Published: 22 October 2024

Abstract

The digital world is rapidly expanding in content, but not all individuals have equal access. In particular, it is challenging to recognize accessibility issues in dynamic web content. Advanced tools help, but cannot identify and mitigate all accessibility issues with sufficient quality and reliability. This makes human feedback exceedingly valuable. However, existing accessibility feedback tools are inadequate: they lack comprehensiveness and offer little guidance for users with disabilities. This study addresses this gap by proposing a universal feedback form that prioritizes the perspectives of users with disabilities, particularly those with visual impairments. Following a participatory approach, our proposed form incorporates design rationales based on insights from an exploratory literature review, an online survey (N=40), focus group discussions (N=12), and expert interviews with web developers specialized in web accessibility (N=3). Furthermore, the form’s effectiveness was evaluated with in-depth feedback from users with visual impairments (N=5). Our evaluation confirms that comprehensible sub-tasks and compatibility with assistive technology help users with disabilities provide useful accessibility feedback. Based on our findings, we contribute design recommendations for the future advancement of universal web accessibility feedback forms.

1 Introduction

The Web Content Accessibility Guidelines (WCAG) have the "goal of providing a single shared standard for web content accessibility that meets the needs of individuals, organizations, and governments internationally" [30]. Driven by this goal, the WCAG has become one of the leading accessibility guides [3] and the foundation of other international accessibility norms [12, 43]. Even though WCAG 2.2 comprises 86 detailed Success Criteria, the W3C Web Accessibility Initiative (WAI), which maintains the WCAG, recommends combining those standards with user involvement for effective accessibility measures [29]. To accommodate this in the EU, a European Parliament Directive mandates the inclusion of feedback mechanisms for web accessibility on public sector websites [44].
Approaching users for input also helps in identifying barriers. While automated tools can verify the validity of code structure or the presence of alternative text for images, they provide varying results depending on the tool [48] and may fail to ascertain whether the alt text accurately describes the image [34]. Consequently, a comprehensive approach considering diverse contexts and users’ unique needs is beneficial. User feedback thus emerges as a vital component, shedding light on the experiences and challenges faced by people with disabilities [16, 18].
Existing user feedback tools, as described in section 2.4, either do not focus on the needs of people with disabilities or engage those stakeholders only marginally. This limits the tools’ effectiveness in meeting user needs, often excludes people with disabilities entirely from providing valuable feedback, and makes the tools themselves a bottleneck for accessibility solutions. Based on this, we formulated our Research Question (RQ):
RQ: How can a feedback tool be designed that effectively enables users, particularly those with visual impairments, to provide high-quality accessibility feedback?
We follow a participatory design (PD) approach to address our RQ, involving two key stakeholder groups: users with visual impairments, and web developers with experience in web accessibility, who act as recipients of the feedback. In answering this question, this study contributes:
A standalone and browser-independent feedback form for web accessibility that seamlessly integrates into diverse websites.
Established design rationales for guiding the creation of an accessible feedback form dedicated to collecting accessible design feedback.
An understanding of the interaction of visually impaired users with a feedback form that empowers them to offer high-quality accessible design feedback.

2 Related Work

2.1 Web Accessibility Issues

Accessibility refers to the design, development, and resulting products, systems, and services [22] which ensure that a wide range of people with different abilities and disabilities can interact with, recognize, understand, navigate, and contribute to the web [9, 60]. The goal of web accessibility is to provide and enable access to web content regardless of users’ abilities [6, 28]. It does not differentiate between temporary, momentary, or permanent disabilities [15]. The WCAG is widely recognized as the most internationally adopted standard for web accessibility [3]. Accessibility compliance and laws ensure that minimum accessibility requirements are met [15]. Still, the WCAG does not fully cover all user concerns [50] or address usability issues [47, 49]. While the WCAG addresses various physical and cognitive disabilities, the user group of people with visual impairments benefits the most from thoughtful web design [13], as they encounter the most significant challenges when interacting with the web [1]. Visual impairment affects at least 2.2 billion people worldwide [42], and inaccessibility will affect ever more people, as the prevalence of visual impairment is expected to increase due to population growth and aging [42]. In general, web accessibility benefits not only people with disabilities but all users [20], because accessibility and usability complement each other [51].

2.2 (Automated) Tools and Techniques for Users with Impairments

Two basic requirements must be fulfilled to enable people with visual impairments to access and contribute to the web [10]. First, they must be able to use adaptive strategies and assistive technologies [13] on their chosen device. Second, content must be designed to be accessible and compatible with these technologies. Assistive technologies include screen readers (which convert the text of web pages into speech and allow navigation, for example, through headlines and links), pop-up and animation blockers, reading assistants, screen magnification, speech recognition, and volume control [15]. Automated testing can be used to check the accessibility of web pages and ensure compliance with guidelines. However, according to recent studies, over 95 % of the world’s top 1,000,000 websites fail basic automated audits [15, 61]. Most significantly, 96.1 % of detected WCAG compliance violations fall into the six categories shown in Table 1.
WCAG Failure Type                        2023      2022      2021
Low contrast text                        83.6 %    83.9 %    86.4 %
Missing alternative text for images      58.2 %    55.4 %    60.6 %
Empty links                              50.1 %    49.7 %    51.3 %
Missing form input labels                45.9 %    46.1 %    54.4 %
Empty buttons                            27.5 %    27.2 %    26.9 %
Missing document language                18.6 %    22.3 %    28.9 %
Table 1: Most common violations of WCAG 2.0 in percentage across one million home pages (WebAIM, 2023)
As seen in Table 1, there have been no major improvements in these categories over the past three years. Other studies have come to similar conclusions [4, 62]. For all six categories, automatic test tools such as WAVE1, axe2, or Equal Access Checker3 are available. But inadequate and inconsistent performance of automatic accessibility tools and features recurs across various studies [2, 25, 45, 48, 59]. Mateus et al. [37] discovered that involving users in accessibility evaluations uncovered distinct issues that automated tools overlooked. In line with these findings, Yesilada et al. [63] have recommended implementing a user-centered approach.
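The failure categories in Table 1 also illustrate why automated auditing is feasible in the first place: most of them can be detected from markup alone. The following is only a rough sketch (not how WAVE, axe, or Equal Access Checker actually work) of such checks, using Python's standard-library HTML parser; it also shows the limitation noted above, since it can verify that an alt text exists but not whether it describes the image.

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Simplified detector for four of the WebAIM failure categories:
    missing document language, missing alt text, empty links, empty buttons.
    Deliberately naive: e.g. an image's alt text inside a link does not
    count as link text here, and label-for associations are not resolved."""

    def __init__(self):
        super().__init__()
        self.violations = []
        self._open = None        # 'a' or 'button' currently open
        self._has_text = False   # did the open element get accessible text?

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and not attrs.get("lang"):
            self.violations.append("missing document language")
        if tag == "img" and not attrs.get("alt"):
            self.violations.append("missing alternative text")
        if tag == "input" and attrs.get("type") not in ("hidden", "submit") \
                and not attrs.get("aria-label") and not attrs.get("id"):
            # a <label for="..."> needs an id to point at; none can match here
            self.violations.append("missing form input label")
        if tag in ("a", "button"):
            self._open = tag
            self._has_text = bool(attrs.get("aria-label"))

    def handle_data(self, data):
        if self._open and data.strip():
            self._has_text = True

    def handle_endtag(self, tag):
        if tag == self._open:
            if not self._has_text:
                self.violations.append(
                    "empty link" if tag == "a" else "empty button")
            self._open = None

def audit(html: str) -> list[str]:
    parser = AuditParser()
    parser.feed(html)
    return parser.violations
```

For example, `audit('<html><img src="x.png"><a href="/"></a></html>')` flags the missing language, the missing alt text, and the empty link; checks like contrast ratios, in contrast, require rendering and are where real tools diverge most.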

2.3 User-Centered and Participatory Design

As described in section 2.2, making web content accessible is not just about following guidelines [27]. It also requires involving users in the design, development, and evaluation process to identify website shortcomings [56]. Therefore, obtaining user feedback and experiential information is essential for achieving accessible design [16].
User-Centered Design (UCD) is an appropriate method for actively involving users. In UCD, the end-user and their needs are placed at the center of a product’s design and development process [35]. This approach is widely used in research [55, 58]. The UCD process is iterative and divided into four phases (analyze, specify, design, and evaluate), in each of which users can be involved.
According to Rouse [52], design should consider all affected individuals rather than just the end-user. He has therefore broadened the definition of UCD to Human-Centered Design (HCD), which includes stakeholders who are not necessarily users. There are many variations of UCD [58]. One of these is Participatory Design (PD) [18], which aims to actively involve stakeholders in all stages of the product design process.

2.4 Existing Accessibility Feedback Mechanisms

Vigo et al. [59] have highlighted the limitations of commonly used accessibility evaluation tools and stressed the importance of selecting appropriate tools to improve coverage. To compensate for the information such tools and manual testing miss, the EU, for example, requires feedback mechanisms on public sector websites so that users can report accessibility concerns [44].
Numerous commercial tools are available to facilitate user feedback, often seamlessly integrating with websites. Notable examples include BugHerd4 and Userback5. Also, crowd-feedback systems are becoming increasingly important in research [26, 41, 54]. However, these tools collect general user feedback and do not specifically focus on feedback regarding web accessibility. Oliveira et al. [39] have argued that precise descriptions of accessibility problems by users are necessary for feedback to be useful to recipients.
We have identified three relevant studies that investigated tools and processes for providing feedback on web accessibility:
(1)
The Accessibility Evaluation Assistant [46] instructs inexperienced evaluators through a learning process and trains them to provide valuable feedback on web accessibility. This learning application is targeted at undergraduate students and guides users in creating feedback rather than sending feedback directly.
(2)
WebAnywhere-ABD (Accessibility by Demonstration) [11] is an extension for the open-source, web-based screen reader WebAnywhere. It allows the user to submit screen reader recordings, which excludes people without that specific software.
(3)
Alarcon et al. [5] introduced the Public Barrier Tracker (PBT) as a tool for users to provide feedback on the web accessibility of a website. It offers a database for users with knowledge of managing barriers. However, the optimal structure of the feedback tool, how to access the tool, and how to collect high-quality feedback have not been examined in the study of Alarcon et al. [5].
It is worth noting that, to the best of our knowledge, none of the three above-mentioned studies actively involved end-users with disabilities or helped them provide useful accessible design feedback in a taxonomy understandable to the feedback recipient. This study aims to bridge the identified research gap by designing a tool that effectively enables users, particularly those with disabilities, to provide high-quality accessible design feedback.

3 Method

To address this gap, we follow the PD approach to actively involve users with disabilities in the process. Our focus for this study is on people with visual impairments, given their significant global prevalence [15, 42]. The study methodology included an exploratory literature review, an online survey, focus group discussions with people with visual impairments, and three expert interviews with web developers focused on accessibility. These procedures resulted in a prototype of an accessible design feedback form, which was evaluated with potential users.

3.1 Initial Exploration

We aimed to design a tool for high-quality accessibility feedback on the basis of a literature review and an online survey.

3.1.1 Exploratory Literature Review.

Guided by Arksey and O’Malley’s approach to scoping studies [8], we searched the ACM Digital Library and Google Scholar to answer the following questions:
What are the factors that have an impact on the web access and navigation experience for blind and visually impaired people?
What tools and adaptations are used by people with visual impairments in their web navigation practices?
What challenges do people with visual impairments face when browsing online?

3.1.2 Online Survey.

An online survey with the user group can achieve a more comprehensive understanding of the literature review findings [24, 55]. We followed Gideon’s [24] seven-step guide for the online survey design.
A total of 79 participants took part in the online survey, conducted with the screen reader accessible tool SoSci Survey6. 22 participants reported no visual impairment, and 17 participants only completed the initial survey questions about age, gender, and visual impairment; these responses were excluded due to careless completion or lack of usefulness for the study. The main analysis included 40 participants (age M = 35.69, SD = 15.76; 20 women, 19 men, 1 non-binary or not specified). Eyesight was self-reported, as shown in Table 2, based on the German standard Versorgungsmedizin-Verordnung – VersMedV (eng. Care Medicine Ordinance) [38].
                 Visually impaired    Highly visually impaired    Blind
Visual acuity    5-30 %               2-5 %                       <2 %
N                9                    12                          19
Table 2: Online Survey Participants ordered by visual acuity

3.2 Discovery Process

Based on these first insights, we built further knowledge through focus groups and developer interviews.

3.2.1 Focus Groups with Users.

Following the online survey, we organized two focus group discussions with 5 and 7 participants, each lasting two hours. For the structure of our questionnaire, we followed Krueger and Casey’s methodology [32], which consists of an opening, introduction, transition, key, and closing section and recommends 5 to 10 participants. The resulting transcripts were manually checked for accuracy, anonymized, and analyzed with inductive coding.

3.2.2 Developer Interviews.

We used expert interviews as a qualitative method to gain insight into what constitutes high-quality feedback for developers [19]. The interviews lasted 30 minutes each and followed a semi-structured format [31]. To ensure consistency and a structured approach, we used a guide containing relevant open-ended questions and topics to explore during the interview [17]. Transcription and coding took place as in section 3.2.1. All interviewees (I.1-I.3) had at least 4 years of development experience and had focused on accessibility for 1 (I.1), 5 (I.2), and 3 (I.3) years.

3.3 Prototyping

As a result of the discoveries in our literature review, focus groups, and interviews, we extracted Design Rationales (DRs) as comprehensive documentation of the decisions and directions regarding the prototype. Detailed arguments and the reasoning behind each conclusion result in a more comprehensive understanding of the product and decision-making process [14]. We involved a user experience expert specialized in web accessibility during the prototyping phase. Spinuzzi [55] and other studies [21] propose including stakeholders in every phase except design. We tested the functionality of the prototype using various screen readers, web browsers, and automated test tools.

3.4 Prototype Evaluation

We evaluated the accessibility and usability of the prototypical feedback form, as well as the quality of the received feedback. For a controlled and realistic environment, we implemented the feedback form in a testing environment of an agency website7. We chose this website due to its various accessibility challenges and because the users have no specific adaptive strategies established for this site. We tasked the participants with visual impairments with evaluating the website’s accessibility freely, ensuring a comprehensive exploration that mirrors the natural user experience. We also analyzed the collected feedback to obtain meaningful insights and identify respondents’ completion patterns. In addition, we delivered the collected feedback to the web developers from section 3.2.2 and conducted semi-structured qualitative interviews with some feedback providers.

4 Participatory Design of A Feedback Form for Accessibility

4.1 Initial Exploration

4.1.1 Results: Literature Review.

The exploratory literature review provided insights into the factors that impact web access and navigation experiences of the user group, as recommended by Spinuzzi [55]. Our findings show that the online experiences of users are shaped by their interaction strategies and skills, type of vision impairment, and privacy concerns. Two fundamental aspects are mastering assistive technology [10, 53] and accessible content design [10]. For older users with vision loss [33] and users with little technology experience, missing accessibility can be overwhelming [7]. Additionally, users with visual impairments face privacy concerns beyond the technical aspects of accessibility [57]. On the positive side, cross-compatibility between assistive technologies has gained prominence, allowing for the simultaneous use of various tools [10].

4.1.2 Results: Online Survey.

Our online survey aimed to gain a deeper understanding of the users’ challenges. 30 out of 40 participants use screen readers, regardless of their level of visual impairment. Besides screen readers, participants reported various other assistive technologies such as screen magnifiers, voice control, and braille displays, and these tools are often used in combination. We allowed the participants to express their challenges in specific categories, listed in Table 3. We based the categories on the previous literature review.
Challenge mentioned                                Frequency
Login or CAPTCHA                                   21
Website Structure and Presentation                 17
Advertisement and Cookie Pop-Ups                   15
Form Completion                                    14
Menu Navigation                                    10
Compatibility with AT and adaptive Strategies       7
Notifications                                       5
Keyboard Navigation                                 4
Others                                              5
Table 3: Challenges and how often they have been mentioned in the online survey
Enriching information provided by the survey respondents (R) for these challenges, mainly concerning assistive technology, includes:
Login or CAPTCHA: CAPTCHAs are often only offered visually (R.39) and audio alternatives are complex to comprehend (R.11). Two-factor authentications can also be a barrier because of expiring codes (R.38).
Website Structure and Presentation: 11 out of 17 participants in this category specifically mentioned that complex and overloaded websites are difficult to navigate, especially with missing information (R.6, R.37). Fonts with serifs can be challenging to read because some letters, such as n and u, look very similar (R.11).
Advertising and Cookie Pop-Ups: Pop-ups can cause disorientation for users (R.34, R.35), and not all pop-ups are properly announced (R.12, R.22). The default deactivation of cookies can also lead to restricted access to several websites (R.2, R.3).
Form completion: Multiple fields in a single line lead to problems (R.2) especially in combination with unclear labeling (R.39, R.40). Furthermore, date selections are often inaccessible (R.5). Sometimes, form correction overwrites filled-in fields unintentionally (R.21).
Menu Navigation: Inadequate control (R.15), skipped items (R.39), and poor responsiveness (R.11) can hinder accessibility.
Compatibility with assistive technologies and adaptive strategies: Graphical representations of information like text highlighting are missed (R.23) and certain components are frequently not correctly pronounced (R.2, R.12). Magnification users face challenges accessing non-responsive websites (R.26).
Notifications: Challenges arise in locating notifications due to premature disappearance (R.26) or their limited size (R.32).
Keyboard Navigation: Users with visual impairments typically rely on the keyboard as their primary means of navigation (R.37), while many websites do not support keyboard shortcuts and keystroke commands for navigation (R.6).
Other challenges mentioned by respondents: Alternative text is lacking (R.3) and image-based explanations are problematic (R.11). Accessibility of certain services, such as Microsoft Teams, may be limited (R.23), resulting in significant professional disadvantages.
Additional challenges uncovered through the literature review: The pronunciation problems mentioned by R.2 and R.12, as well as incorrect language settings, are corroborated by Mateus et al. [36]. Power et al. [49] additionally describe a recurring lack of multimedia controls and of site content transformation.

4.2 Discovery Process

The online survey and additional exploratory literature research provided a first in-depth look at the work tools and challenges of people with visual impairments. To gain necessary insights for the subsequent prototyping, we worked with two focus groups and interviewed three web developers.

4.2.1 Results: Focus Group.

We used an inductive coding approach and identified 97 codes across three hierarchical levels. Applying these codes to the two focus groups resulted in 519 codings, offering additional independent insight beyond what was obtained from the online survey. An entry task for all participants (P) was part of the focus group to begin the conversation. The task involved navigating two different websites and locating the feedback form. The group discussion revealed the following:
Opportunities and challenges of the entry task: The headline mode of a screen reader, which reads out only the headlines, allowed for a quick website overview (P.8). Participants who utilized the search function were able to locate the button “after trying two or three keywords” (P.4). Participants could use screen magnifiers to access and observe the website (P.7, P.9).
Motives for reporting problems: The availability of alternatives to a website or dependency on a specific website play a big role in motivation to report issues. If there is no way around the website, users are more likely to report issues (P.2). Unfortunately, sometimes it is a struggle to “[...] find the appropriate place to turn to” with feedback (P.3). But participants like P.4 also mentioned positive experiences, where they were informed within 24 hours that their reported issues have been resolved.
Expectations regarding the feedback tool: It should be user-friendly (P.10) with clear instructions (P.4) and a good structure (P.1, P.2) that does not change too often, to accommodate people with cognitive impairments who like to work with known structures (P.2). Sufficient contrast ratio (P.7) and compatibility with various web browsers and devices (P.2, P.6, P.8) while not modifying the page (P.10) have been mentioned. For the feedback recipient, the form should provide enough information (P.4) and clear problem descriptions (P.8).
Both focus groups agreed on using a form-based feedback tool to guide the feedback provider and ensure usefulness.

4.2.2 Results: Developer Interviews.

The main goal of interviewing three specialized web developers was to gather insights on the process of delivering effective feedback regarding web accessibility. With inductive coding, we identified 32 codes categorized into three hierarchical levels. This resulted in 175 codings from talking to the three interviewees (I). The interviews provided multiple insights that are fundamental to the form. According to I.3, feedback quality increases when it is reported directly on the problematic site. Feedback should be clear, comprehensible, and help the recipient understand the problems that occurred (I.2) while being concise (I.1, I.2), which aligns with the statements in section 4.2.1. The reproducibility of the problem was emphasized by all three web developers. An extra browser extension that "must be installed" is viewed critically by I.3, because the feedback form should be easily accessible (I.2).
If possible, developers find it important to be able to ask the feedback provider follow-up questions (I.1). It was mentioned that it may be advantageous to disclose any disabilities to the feedback recipient, enabling them to understand how the problem affects the user’s experience on the website (I.2). However, this conflicts with privacy concerns and might not provide useful information in cases of complex disabilities (I.3).
Consequently, both the focus group participants and web developers agreed that a feedback form is the most common and effective way to provide feedback.

4.3 Prototyping

4.3.1 Design Rationales based on previous results.

Based on the literature review, online survey, focus groups, and expert interviews, we developed four key design rationales for the design of accessibility feedback tools:
(1)
Universal access with different assistive technologies and adaptive strategies to ensure a usable feedback form for all users. Access to the form should be provided in the best way possible.
(2)
A guided feedback process helps inexperienced feedback providers report accessibility issues effectively.
(3)
User empowerment and transparent information collection are important for users with disabilities who might be more sensitive to the details of the information they provide. The form should encourage users to provide details while explaining the purpose and strengthening users’ trust.
(4)
A holistic accessibility and usability focus makes sure the form is accessible according to established standards, addressing the specific needs of people with disabilities while incorporating usability to extend the form’s effectiveness to a diverse user base.

4.3.2 Implementation and Design.

To implement our four design rationales in our artifact, we devised 21 design features (DFs) that can be clustered into eight feature groups (FGs). The complete mapping with direct connections between these three categories is provided in Table 4. Further explanation of the DFs follows in section 4.3.3. The decision for a guided form was made based on the feedback from the focus groups and developer interviews, as this ensures consistent problem descriptions. Based on DR2 and DR3, we divided the provided feedback into sections with three categories of necessary information: tools and adaptations, problem description, and contact details. To make the form usable across web pages, we developed a standalone, browser-independent feedback form.
DR1: Universal Access
    FG1: Locating and accessing the feedback form
        DF1: Link placement
        DF2: Link naming
        DF3: Link icon and aria-label
        DF4: Separated browser tab
DR2: Guided Feedback Process
    FG3: Providing tools and adaptations
        DF6: Checkboxes for tools and adaptations
        DF7: Input field and explanation
        DF8: Checkbox browser and operating systems
    FG4: Providing feedback
        DF9: Subject field
        DF10: Input problem description and explanation
        DF11: File upload
    FG5: Providing contact details
        DF12: Contact details and explanation
        DF13: Copy form
DR3: User Empowerment and Transparent Information Collection
    FG2: Empowering by explanations
        DF5: Initial explanation
    FG6: Submitting the form
        DF14: Submitting
        DF15: No CAPTCHA
        DF16: Reaction after submitting
    FG7: Contact Point
        DF17: Contact Point
DR4: Holistic Accessibility and Usability Focus
    FG8: Further accessibility and usability considerations
        DF18: Responsive design
        DF19: Headline navigation
        DF20: Visual characteristics
        DF21: Less required fields
Table 4: Connections between Design Rationale, Feature Groups, and Design Features
Figure 1: Feedback Form Screenshot

4.3.3 Design Features.

Figure 1 provides a screenshot of the finished form with all implemented DFs.8 Ordered by FG, we present the DFs below, together with the corresponding input from the focus groups, interviews, and literature review.
FG1: Locating and Accessing the Feedback Form
DF1 Link Placement: A prominently placed link to the form (P.10) at the top or bottom (P.8).
DF2 Link Naming: To conform with established user behavior, the link carries a searchable title that includes "something with [the word] barrier" (P.4).
DF3 Link Icon and Aria-label: Declare the link as a way to an external application for screen reader accessibility.
DF4 Separated Browser Tab: As indicated by the survey findings, pop-ups pose challenges; the form therefore opens in a separate tab, which users support (P.10).
FG2: Empowering by Explanations
DF5 Initial Explanation: Creates trust with users by transparent communication (P.1, P.2) encouraging them to provide detailed feedback while describing the purpose of the form.
FG3: Providing Information to Tools and Adaptations
DF6 Checkboxes for Tools and Adaptations: Allows users to choose between limited choices for easy decisions (I.2) regarding their used assistive technologies and adaptive strategies to better understand the problem (I.1-3).
DF7 Input Field and Explanation of used Technology: Text input fields give opportunity to provide further detailed explanation of users’ technical setups, as requested by the three developers.
DF8 Checkbox for Browser and Operating System: The developers and focus groups identified the need for this. Consequently, we introduced an optional checkbox, accompanied by an explanatory note and data privacy info, related to the automatic input of browser and operating system data after user confirmation.
FG4: Providing Feedback
DF9 Subject Field: The problem description follows an email format, with a subject and body. Problem categories were not used because they may not always be obvious (P.2) and users questioned their usefulness (P.4).
DF10 Input Problem Description and Explanation: This provides an opportunity for a clear and concise description of the problem (P.1). We make suggestions for a complete understanding including the exact location of the problem (I.3). We set this field to mandatory to prevent incomplete submissions.
DF11 File Upload: As requested by I.1, feedback providers have the option to attach a file. Even though the usefulness of this feature for users with screen readers was questioned in the focus group, participants reported they could record and send the output of the screen reader (P.6).
FG5: Providing Contact Details
DF12 Contact Details and Explanation: The feedback provider can enter contact information for the feedback recipient in cases where questions arise. This is optional because users “[...] just want to decide freely what data [...] [they] want to disclose” (P.2). Feedback providers can stay informed about the problem-solving process with this.
DF13 Copy of Form: Before submitting feedback, we offer users a copy of the form via email (P.10) to keep a record of their feedback (P.6, P.10).
FG6: Submitting the Form
DF14 Submitting: Validates if all mandatory fields are filled out and concludes the process.
DF15 No CAPTCHA: For a smooth process of submitting feedback without any barriers (P.1) and because our research indicated that CAPTCHAs pose significant challenges.
DF16 Reaction after Submitting: Participants emphasized the importance of receiving confirmatory notifications that are not limited by time to ensure the success of their submissions.
FG7: Contact Point
DF17 Contact Point: In case of questions or issues regarding the form itself. This is only provided after the submit button to ensure that the feedback provider uses the form for submitting feedback.
FG8: Further Accessibility and Usability Considerations
DF18 Responsive Design: The responsive design is fundamental for magnification tool users.
DF19 Headline Navigation: Contributing to DR2, we optimize for this important interaction pattern of screen reader users.
DF20 Visual Characteristics: A sans-serif font was chosen, and the company colors used adhere to the minimum contrast levels of WCAG AA. To ensure usability, we provide clear and easy-to-understand explanations in short sentences (P.2).
DF21 Fewer Required Fields: Only two mandatory fields exist in the form – problem description and privacy policy. Focus group participants clearly expressed a preference for reducing the number of required fields.
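The mandatory-field behavior described in DF10, DF14, and DF21 can be sketched as a small client-side validation step. The following TypeScript is an illustrative sketch under our own assumptions, not the authors' implementation; all field and function names are hypothetical.

```typescript
// Illustrative sketch of DF14/DF21: only the problem description and the
// privacy consent are mandatory; all other fields stay optional.
// Field and function names are hypothetical.
interface FeedbackSubmission {
  description: string;       // DF10: mandatory problem description
  privacyAccepted: boolean;  // mandatory privacy policy consent
  subject?: string;          // optional
  email?: string;            // DF12: optional contact details
  attachmentName?: string;   // DF11: optional file, e.g. a screen reader recording
}

// DF14: validate only the two mandatory fields before submitting.
function validateFeedback(f: FeedbackSubmission): string[] {
  const errors: string[] = [];
  if (f.description.trim().length === 0) {
    errors.push("Please describe the accessibility problem.");
  }
  if (!f.privacyAccepted) {
    errors.push("Please accept the privacy policy.");
  }
  return errors;
}
```

Keeping the error messages human-readable would allow them to be exposed to screen reader users, in line with the form's accessibility goals.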

5 Prototype Evaluation Results

We invited 166 associations and self-help groups for people with visual impairments, as well as our focus group participants, to share the task of evaluating the website with their contacts. Overall, we received 19 feedback responses from people with visual impairments through our dedicated feedback form. Three participants did not enter relevant information regarding web accessibility, leaving 16 accessibility reports for evaluation. After the feedback task, we reached out again to the feedback providers and had the opportunity to interview five respondents, all of whom, coincidentally, had been visually impaired since birth. Further demographics are described in Table 5. Information on interviewees’ eyesight (categorized as in Table 2) and technical affinity was self-reported. Each interview with E.1–E.5 lasted 20 minutes. Interviews were audio recorded and transcribed. We performed inductive coding, resulting in a dataset of 91 codings.
Table 5:
ID | Sex | Age | Impairment | Assistive technologies | Technical affinity
E.1 | m | 38 | visually impaired | Screen magnification | High
E.2 | m | 67 | visually impaired | Screen magnification | Very low
E.3 | m | 22 | blind | Screen reader, braille displays | Very high
E.4 | m | 60 | blind | Screen reader | High
E.5 | f | 26 | blind | Screen reader, screen magnification | High
Table 5: Interview partners in the evaluation

5.1 Feedback Quality

The three web developers from the interviews in section 3.2.2 evaluated the quality of the gathered feedback. Each developer assessed all 16 feedback items to ensure a comprehensive evaluation. The evaluation criteria were based on the study by Oppenlaender et al. [40]:
Specificity: The provided feedback is sufficiently detailed, allowing for a clear understanding of the problem or proposed solution.
Reproducibility: Necessary information, such as the tools used, has been provided to reproduce the reported problem.
Actionability: It is possible to deduce how the problem can be addressed based on the feedback.
Conciseness: The feedback is brief and to the point.
Relevance: The feedback is perceived as relevant and useful with respect to web accessibility.
Quality: Assessing the overall quality of the feedback.
Boxplots illustrating the descriptive statistics of the responses across the six dimensions, including the overall score, are provided in Figure 2. To enhance the understanding of the score distribution within each dimension, we overlaid the individual scores on the boxplots using a bee swarm visualization. Additionally, Table 6 provides a list of characteristics of high-quality and inadequate feedback based on the evaluations of feedback items.
Figure 2:
Figure 2: Feedback item evaluation with quality criteria
For each criterion, we averaged the scores given by the three developers. The average overall feedback score is 5.28 points on a scale from 1.00 (strong disagreement) to 7.00 (strong agreement). Three of the 16 feedback items were evaluated with high scores of 7.00. The values per criterion range from 2.00 to 7.00 points, except for relevance and conciseness, where the minimum scores are 4.00 and 2.67, respectively. The four dimensions of specificity, actionability, quality, and conciseness show comparable median values between 4.83 and 5.17 points. In contrast, reproducibility and relevance exhibit higher median values (5.67 and 6.00 points). Relevance also achieved the highest average score and showed the lowest variance overall.
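The aggregation underlying these statistics (averaging the three developers' ratings per criterion, and summarizing each dimension by its median) can be sketched as follows; the helper names and example values are illustrative and not taken from the study data.

```typescript
// Sketch of the score aggregation in Section 5.1: each feedback item is
// rated 1-7 per criterion by three developers; the per-criterion score is
// the mean of those ratings, and medians summarize each dimension.
function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function median(xs: number[]): number {
  const s = [...xs].sort((a, b) => a - b);
  const mid = Math.floor(s.length / 2);
  return s.length % 2 === 1 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

// Example: three hypothetical developer ratings for one item's specificity.
const specificityScore = mean([5, 4, 6]); // 5.00
```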
We identified additional insights and connections between the dimensions based on the provided comments and ratings. On average, specificity received lower ratings (mean = 5.12 points) than reproducibility (mean = 5.48 points). For example, one feedback item lacked specificity but was reproducible with a screen reader, resulting in a higher reproducibility rating (I.1, I.3). Feedback relevance received higher average ratings (mean = 5.92 points) than specificity (mean = 5.12 points). This was evident in feedback item 12, which was of interest but incomprehensible, and for which “[...] no contact details are given” (I.3). Additionally, if a respondent reports multiple issues in a single feedback item, the developers perceive the feedback as less concise (I.1).
Table 6:
High-quality feedback:
- exact position on the website
- easy to reproduce
- contact details for questions
- detailed tool / system information
- barrier and impact mentioned
- precise problem description
Inadequate feedback:
- insufficient problem description
- no contact details for questions
- unclear position on website
- unspecified tools
Table 6: Characteristics of varying feedback quality

5.2 Feedback Form Experience

Overall, all interviewed feedback providers liked the concept of a feedback form for web accessibility. It was perceived as structured, easy to use, and enjoyable. Nonetheless, improvement suggestions were made; they are described in sections 5.2.1–5.2.5.

5.2.1 Feedback Form in General.

None of the feedback providers attached files. 14 out of 16 feedback items contained contact details, and 8 of those 14 users wanted a copy of the feedback. Browser and operating system information was shared only 8 times. All interviewees emphasized that “it’s good to have some guidance for people” giving feedback (E.4), especially those “who don’t give feedback very often” (E.4). E.1 expressed a strong interest in integrating a feedback form into every website, further suggesting that feedback “[...] should not only end up with the owner of the website, [...] but perhaps also with the [legal] body responsible” (E.1).

5.2.2 Link Naming and Placement.

Feedback providers had varying perceptions of the placement of the feedback form link. Some found it intuitively at the bottom of the page (E.4). Even though the link was clearly named (E.3), some users wished for a more established formulation, such as the reporting of barriers (E.4). Feedback provider E.1 stated, “[...] if I hadn’t known that this feedback form existed there, I would never have noticed it at this position”. One screen magnification user initially reported trouble identifying the link (E.2).

5.2.3 Guided and Empowered Feedback Process.

The form was perceived as positively affecting the feedback providers’ ability to deliver high-quality input (E.4). In particular, E.1 found that the headlines helped guide them toward the necessary information. One interviewee admitted to not reading all of the descriptions, stating that “[...] reading is exhausting for me” (E.2). In contrast, E.3 perceived the labels and descriptions as very helpful. Multiple interviewees mentioned the reduced number of mandatory fields as a positive aspect.
One noteworthy observation is that respondents entered their feedback into the subject field. Because the description field is mandatory, some respondents duplicated the subject field input or referred to it. The developers did not express any concerns regarding this.

5.2.4 Accessibility and Usability.

The interviewees assessed accessibility on a 10-point Likert scale; the form received a rating of 8.40 points. They cited the font, navigation, and arrangement as reasons. The technical setup, including the vertical arrangement (E.2), made the form fully functional (E.4) and easy to read (E.2). Some contrast issues arose, with input fields not being visible enough (E.5) or the combination of the white background and a relatively thin font (E.2). The initial description box was only legible for E.5 with a screen reader.
Multiple users had issues recognizing the success notification after submitting the feedback and only interpreted the cleared form content or the mail notification as confirmation. As a solution, E.5 suggested redirecting to a different page layout with a success message, which unfortunately would remove the option “of sending further feedback directly without having to navigate again” (E.5).
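One common remedy for this notification problem, assuming a standard HTML/ARIA setup, is a polite live region that announces success while the form stays on the page; the following sketch is hypothetical and not the authors' implementation.

```typescript
// Hypothetical sketch: confirm submission via an ARIA live region instead
// of navigating to a new page, so the form remains available for further
// feedback (cf. E.5). role="status" implies aria-live="polite", meaning
// screen readers announce the text once it is injected.
function successRegionHtml(message: string): string {
  return `<div role="status" aria-live="polite">${message}</div>`;
}

const confirmation = successRegionHtml(
  "Thank you - your feedback was submitted successfully."
);
```

Pairing such a live region with a visually prominent, persistent message could address both the screen reader and the magnification users' difficulties reported above.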

5.2.5 Further Improvements.

Optimization for the forms mode of a screen reader, which skips non-form content, is needed (E.4). It was recommended to give more information on the use of the email address (E.1). Based on his technical affinity, E.1 recommended the implementation of a CAPTCHA for security reasons.

6 Discussion

To improve web accessibility, user feedback is a significant component. Using the participatory design approach, we involved multiple stakeholders in conceptualizing a feedback form. We worked with various methodologies, including online surveys, focus groups, and expert interviews, to gain an understanding of the user base. Our study revealed that the diversity of impairments requires a nuanced approach, leading us to consider further enhancements for a more comprehensively accessible feedback form.

6.1 Specialized Feedback Forms lead to High-Quality Feedback

The overall feedback quality score had a mean of 5.28 out of 7.00. We argue that achieving a score above the midpoint is commendable; however, the lack of a baseline for comparison limits our ability to conclusively determine the significance of this score. Our evaluation showed that the quality dimension has the lowest mean (4.98 points) among all six dimensions. These results are consistent with the findings reported by Haug et al. [26]. This difference can be partially attributed to the inherent complexity of evaluating quality, which depends on various factors in the evaluation process. In contrast, reproducibility and relevance exhibit the highest means (5.48 and 5.92 points, respectively), demonstrating that the feedback was understandable and that problems could be identified. Relevance also has the highest overall rating and the lowest variation. The high relevance scores and low variance indicate that developers can easily understand the significance of web accessibility problems.

6.2 Various Unique Needs

It is crucial to recognize that users have different abilities and that achieving conciseness may not be a strength for everyone. Therefore, it is beneficial to customize solutions for various users’ needs and capabilities [23]. In this regard, it is worth noting that participants in our study, particularly those with technical expertise, preferred the feedback form link placed in the footer. However, participants lacking experience in providing feedback and with low technical affinity perceived the form’s positioning as inconspicuous. These findings align with our exploratory literature review, highlighting the significance of users’ skills in navigating the digital realm and mastering assistive technologies. To accommodate users with varied abilities, the success notification is not time-limited. However, some participants did not visually recognize the notification and suggested redirecting to a different page, while others found it sufficient.
Despite our efforts to create an accessible feedback form, we acknowledge the diversity of visual impairments, which leads us to assert that a universally perfect accessible tool may be unattainable. Nevertheless, the form already accommodates a wide range of users, as demonstrated by all participants being able to provide feedback.

6.3 Enhancing User Engagement and Trust

We identified the user characteristic of being sensitive about sharing data. We assume that the concept of automatic data collection is unfamiliar to many users, leading to a lack of understanding about what specific information is being collected. One interviewee mentioned a preference for manually entering the operating system and browser.
We also included a statement inviting users to share their contact information to receive updates on the problem-resolution process. This is intended to demonstrate the seriousness with which we treat reported issues and to motivate users to contribute feedback in the future. While we obtained a 50 % rate of contact detail entries, we did not make the contact details mandatory but encouraged voluntary disclosure on the users’ part.

7 Design Recommendations for Feedback Forms On Web Accessibility

Based on the results and implications discussed in the preceding sections, we derived eight design recommendations for designing feedback forms for web accessibility focusing on visually impaired people.
Provide a visible link to the feedback form: Link placement in the footer was intuitive for most of our interviewees. An additional button or menu item at the top of the page would further improve discoverability.
Open the feedback form in a separate browser tab: Overlays and pop-ups have been shown to be problematic. Display the form in a separate tab or window.
Implement a clear headline structure: A logical sequence of headlines should be maintained to facilitate user navigation.
Ensure transparent communication: Why data is being collected is meaningful to the user. Articulate the purpose of collecting user data; this transparent approach fosters user trust.
Promote a minimal amount of mandatory fields: It is crucial to have a form that can be personalized to users’ needs. Minimize mandatory fields to encourage user participation.
Evaluate and use CAPTCHAs with care: Whether it is necessary to implement CAPTCHAs depends on the balance between security and accessibility. If they are used, they should be implemented thoughtfully.
Acknowledge user actions promptly: After submitting feedback, the user should receive an explicit confirmation. A new page layout with a confirmation message and a button to submit further feedback would improve the user experience.
Include a contact point in the feedback form: Recognize potential user challenges in completing the form. Alternative channels should always be provided.
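The "clear headline structure" recommendation above can be checked mechanically, for example by verifying that heading levels start at h1 and never skip a level when descending. The following sketch is illustrative; the rule and function name are our assumption, not part of the study.

```typescript
// Illustrative check for a logical headline sequence: given the heading
// levels of a page in document order (h1 = 1, h2 = 2, ...), verify that
// the sequence starts at h1 and never skips a level when descending.
function headingsAreLogical(levels: number[]): boolean {
  if (levels.length === 0) return true;
  if (levels[0] !== 1) return false; // page should start with an h1
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) return false; // e.g. h2 followed by h4
  }
  return true;
}
```

Such a check could run in an automated test suite so that heading regressions are caught before screen reader users encounter them.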

8 Limitations and Future Work

The PD approach allowed us to actively involve users with visual impairments as well as web developers. Even though PD calls for continuous artifact improvement based on user input, our study covers each phase only once. Our evaluation also identifies additional design issues that provide valuable starting points for further improvements. Spinuzzi [55] has proposed involving stakeholders in every phase of PD, since improving the form during development proves more efficient than adjusting it afterward.
It is essential to note that the focus on this specific group of people with visual impairments may limit the generalizability and comprehensiveness of our findings. Focusing exclusively on one disability group may result in overlooking the complex challenges posed by other disabilities, such as motor impairments, cognitive disabilities, and hearing impairments. Additionally, accessibility challenges often arise from complex interactions among various disabilities.
The study’s participant pool was biased toward individuals with high technical affinity. The one interviewee who specified a low technical affinity had the most concerns regarding the feedback form, for example, regarding the placement of the link to the form. Our study included feedback items from 16 participants for the evaluation, a sample size we consider satisfactory.
We did not explore practical strategies for motivating individuals which could be a future research avenue. Customization of the feedback form as an additional feature could enhance the usability and accessibility of the form. Options to reduce the text within the form would benefit individuals who have challenges reading long text.
This study does not provide a comprehensive comparative analysis of the feedback mechanism’s design features, making it challenging to identify the most impactful elements.
A systematic categorization system within the database could be employed to enhance organizational sorting in anticipation of increased feedback volume on highly trafficked websites. Emerging technologies, such as text-based artificial intelligence, could automatically categorize feedback.

9 Conclusion

In today’s digital landscape, achieving web accessibility is an ongoing challenge, despite the existence of guidelines and automated testing tools. Current research on feedback tools for web accessibility lacks a comprehensive approach to gathering the information necessary for problem-solving and leaves out many users. Previous studies do not address the perspectives of feedback providers, particularly end-users with disabilities, and do not effectively enable them to provide useful accessible design feedback. European Union legislation [44] mandates feedback mechanisms for public sector websites, clearly underscoring their value. Our initiative focuses on designing a universal solution applicable to both public and private sector websites.
The chosen form format prioritizes the needs of users with disabilities, particularly those with visual impairments. In a participatory design approach, stakeholders, including individuals with visual impairments and developers as feedback recipients, actively contributed to the implementation and design of the form through an online survey, focus groups, and interviews.
The resulting design rationales encompass universal access, a guided feedback process, user empowerment, transparent information collection, and emphasizing a holistic focus on accessibility and usability. In alignment with these DRs, we propose a feedback form that enables the collection of high-quality feedback from diverse users, including people with visual impairments.
Our prototype of the feedback form was evaluated with 5 feedback providers in qualitative interviews and 3 developers with expertise in web accessibility as feedback recipients. The user interviews confirmed the accessibility and usability of our feedback form and showed that users appreciate it because its structure and explanations guide them toward high-quality accessibility feedback. In the feedback item evaluation, developers assessed the quality of the gathered feedback using a 7-point Likert scale across six dimensions. The average overall score was 5.28, demonstrating the form’s effectiveness in facilitating feedback from all participants. The analysis of the collected feedback highlighted the need for detailed reproduction steps in problem descriptions.
Further research is needed, including the investigation of link placement, which has the potential to extend the form’s reach, and a comparative analysis of specific design features and other feedback mechanisms.
The identified 4 DRs, accompanied by 8 feedback form recommendations, serve as a foundation for designing more user-centered solutions for web accessibility.
In conclusion, our feedback form is valuable for collecting high-quality accessible design feedback. While a baseline comparison with existing feedback mechanisms remains open, the evidence indicates a promising direction for future research and development in this domain. A practical solution for collecting high-quality accessible design feedback from users with visual impairments was possible due to a participatory design approach that included end-users and stakeholders.

Footnotes

8
An open-source repository including the inovex form plus instructions to build your own local and production builds is provided at: https://github.com/human-centered-systems-lab/a11y-feedback

References

[2]
Shadi Abou-Zahra, Judy Brewer, and Michael Cooper. 2018. Artificial Intelligence (AI) for Web Accessibility: Is Conformance Evaluation a Way Forward?. In Proceedings of the 15th International Web for All Conference. ACM, Lyon France, 1–4.
[3]
Hayfa Y. Abuaddous, Mohd Zalisham, and Nurlida Basir. 2016. Web Accessibility Challenges. International Journal of Advanced Computer Science and Applications 7, 10 (2016).
[4]
Patricia Acosta-Vargas, Luis Antonio Salvador-Ullauri, and Sergio Lujan-Mora. 2019. A Heuristic Method to Evaluate Web Accessibility for Users With Low Vision. IEEE Access 7 (2019), 125634–125648.
[5]
Diane Alarcon, Kim Andreasson, Justyna Mucha, Annika Nietzio, Agata Sawicka, and Mikael Snaprud. 2018. A Public Barrier Tracker to Support the Web Accessibility Directive. In Computers Helping People with Special Needs, Klaus Miesenberger and Georgios Kouroupetroglou (Eds.). Vol. 10896. Springer International Publishing, Cham, 22–26. Series Title: Lecture Notes in Computer Science.
[6]
Sophia Alim. 2021. Web Accessibility of the Top Research-Intensive Universities in the UK. SAGE Open 11, 4 (Oct. 2021), 215824402110566.
[7]
Andrew Arch, Shadi Abou-Zahra, and Shawn Lawton Henry. 2009. Older Users Online: WAI Guidelines Address Older Users Web Experience. https://www.w3.org/WAI/posts/2009/older-users-online
[8]
Hilary Arksey and Lisa O’Malley. 2005. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology 8, 1 (Feb. 2005), 19–32.
[9]
Sher Badshah, Arif Ali Khan, Shahid Hussain, and Bilal Khan. 2021. What users really think about the usability of smartphone applications: diversity based empirical investigation. Multimedia Tools and Applications 80, 6 (March 2021), 9177–9207.
[10]
Armando Barreto and Scott Hollier. 2019. Visual Disabilities. In Web Accessibility, Yeliz Yesilada and Simon Harper (Eds.). Springer London, London, 3–17. Series Title: Human–Computer Interaction Series.
[11]
Jeffrey P. Bigham, Jeremy T. Brudvik, and Bernie Zhang. 2010. Accessibility by demonstration: enabling end users to guide developers to web accessibility solutions. In Proceedings of the 12th international ACM SIGACCESS conference on Computers and accessibility. ACM, Orlando Florida USA, 35–42.
[12]
Governo Eletrônico Brasileiro. [n. d.]. eMAG - Modelo de Acessibilidade em Governo Eletrônico. https://emag.governoeletronico.gov.br/#s1
[13]
Peter Brophy and Jenny Craven. 2007. Web Accessibility. Library Trends 55, 4 (2007), 950–972. https://www.ideals.illinois.edu/items/3932
[14]
John M. Carroll and Mary Beth Rosson. 2003. Design Rationale as Theory. In HCI Models, Theories, and Frameworks. Elsevier, 431–461.
[15]
Sukriti Chadha. 2023. Beyond Accessibility Compliance: Building the Next Generation of Inclusive Products. Apress, Berkeley, CA.
[16]
Tim Coughlan, Thomas Daniel Ullmann, and Kate Lister. 2017. Understanding Accessibility as a Process through the Analysis of Feedback from Disabled Students. In Proceedings of the 14th International Web for All Conference. ACM, Perth Western Australia Australia, 1–10.
[17]
Barbara DiCicco-Bloom and Benjamin F Crabtree. 2006. The qualitative research interview. Medical Education 40, 4 (April 2006), 314–321.
[18]
Ezequiel Duque, Guilherme Fonseca, Heitor Vieira, Gustavo Gontijo, and Lucila Ishitani. 2019. A systematic literature review on user centered design and participatory design with older people. In Proceedings of the 18th Brazilian Symposium on Human Factors in Computing Systems. ACM, Vitória Espírito Santo Brazil, 1–11.
[19]
Stefanie Döringer. 2021. ‘The problem-centred expert interview’. Combining qualitative interviewing approaches for investigating implicit expert knowledge. International Journal of Social Research Methodology 24, 3 (May 2021), 265–278.
[20]
Francisco J. Estrada-Martínez, José R. Hilera, Salvador Otón, and Juan Aguado-Delgado. 2022. Semantic web technologies applied to software accessibility evaluation: a systematic literature review. Universal Access in the Information Society 21, 1 (March 2022), 145–169.
[21]
Suzan Evers, Gamze Z. Dane, Pauline E. W. Van Den Berg, Alexander K. A. J. Klippel, Timon Verduijn, and Theo A. Arentze. 2023. Designing healthy public spaces: A participatory approach through immersive virtual reality. AGILE: GIScience Series 4 (June 2023), 1–8.
[22]
International Organization for Standardization. 2021. Ergonomics of human-system interaction Part 20: An ergonomic approach to accessibility within the ISO 9241 series.
[23]
Ryan Fritz, Kim-Phuong L. Vu, and Wayne E. Dick. 2019. Customization: The Path to a Better and More Accessible Web Experience. In Human Interface and the Management of Information. Visual Information and Knowledge Management, Sakae Yamamoto and Hirohiko Mori (Eds.). Vol. 11569. Springer International Publishing, Cham, 3–21. Series Title: Lecture Notes in Computer Science.
[24]
Lior Gideon. 2012. The Art of Question Phrasing. In Handbook of Survey Methodology for the Social Sciences, Lior Gideon (Ed.). Springer New York, New York, NY, 91–107.
[25]
Kate S Glazko, Momona Yamagami, Aashaka Desai, Kelly Avery Mack, Venkatesh Potluri, Xuhai Xu, and Jennifer Mankoff. 2023. An Autoethnographic Case Study of Generative Artificial Intelligence’s Utility for Accessibility. In The 25th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, New York NY USA, 1–8.
[26]
Saskia Haug, Ivo Benke, Daniel Fischer, and Alexander Maedche. 2023. CrowdSurfer: Seamlessly Integrating Crowd-Feedback Tasks into Everyday Internet Surfing. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. ACM, Hamburg Germany, 1–16.
[27]
Silje Havrevold Henni, Sigurd Maurud, Kristin Skeide Fuglerud, and Anne Moen. 2022. The experiences, needs and barriers of people with impairments related to usability and accessibility of digital health solutions, levels of involvement in the design process and strategies for participatory and universal design: a scoping review. BMC Public Health 22, 1 (Jan. 2022), 35.
[28]
Yavuz Inal, Deepti Mishra, and Anne Britt Torkildsby. 2022. An Analysis of Web Content Accessibility of Municipality Websites for People with Disabilities in Norway: Web Accessibility of Norwegian Municipality Websites. In Nordic Human-Computer Interaction Conference. ACM, Aarhus Denmark, 1–12.
[29]
W3C Web Accessibility Initiative (WAI). [n. d.]. Involving Users in Web Projects for Better, Easier Accessibility. https://www.w3.org/WAI/planning/involving-users/
[30]
W3C Web Accessibility Initiative (WAI). [n. d.]. WCAG 2 Overview. https://www.w3.org/WAI/standards-guidelines/wcag/
[31]
Shazia Jamshed. 2014. Qualitative research method-interviewing and observation. Journal of Basic and Clinical Pharmacy 5, 4 (2014), 87.
[32]
Richard A. Krueger and Mary Anne Casey. 2015. Focus groups: a practical guide for applied research (5th edition ed.). SAGE, Los Angeles London New Delhi Singapore Washington DC.
[33]
Sri Kurniawan, Andrew Arch, and Sean-Ryan Smith. 2019. Ageing and Older Adults. In Web Accessibility, Yeliz Yesilada and Simon Harper (Eds.). Springer London, London, 93–119. Series Title: Human–Computer Interaction Series.
[34]
Jonathan Lazar, Patricia Beere, Kisha-Dawn Greenidge, and Yogesh Nagappa. 2003. Web accessibility in the Mid-Atlantic United States: a study of 50 homepages. Universal Access in the Information Society 2, 4 (Nov. 2003), 331–341.
[35]
Ji-Ye Mao, Karel Vredenburg, Paul W. Smith, and Tom Carey. 2005. The state of user-centered design practice. Commun. ACM 48, 3 (March 2005), 105–109.
[36]
Delvani Antônio Mateus, Carlos Alberto Silva, Arthur F. B. A. De Oliveira, Heitor Costa, and André Pimenta Freire. 2021. A Systematic Mapping of Accessibility Problems Encountered on Websites and Mobile Apps: A Comparison Between Automated Tests, Manual Inspections and User Evaluations. Journal on Interactive Systems 12, 1 (Nov. 2021), 145–171.
[37]
Delvani Antônio Mateus, Carlos Alberto Silva, Marcelo Medeiros Eler, and André Pimenta Freire. 2020. Accessibility of mobile applications: evaluation by users with visual impairment and by automated tools. In Proceedings of the 19th Brazilian Symposium on Human Factors in Computing Systems. ACM, Diamantina Brazil, 1–10.
[38]
Federal Office of Justice Germany. 2008. VersMedV - Verordnung zur Durchführung des § 1 Abs. 1 und 3, des § 30 Abs. 1 und des § 35 Abs. 1 des Bundesversorgungsgesetzes. https://www.gesetze-im-internet.de/versmedv/BJNR241200008.html
[39]
Alberto Dumont Alves Oliveira, Paulo Sérgio Henrique Dos Santos, Wilson Estécio Marcílio Júnior, Wajdi M Aljedaani, Danilo Medeiros Eler, and Marcelo Medeiros Eler. 2023. Analyzing Accessibility Reviews Associated with Visual Disabilities or Eye Conditions. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. ACM, Hamburg Germany, 1–14.
[40]
Jonas Oppenlaender, Elina Kuosmanen, Andrés Lucero, and Simo Hosio. 2021. Hardhats and Bungaloos: Comparing Crowdsourced Design Feedback with Peer Design Feedback in the Classroom. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama Japan, 1–14.
[41]
Jonas Oppenlaender, Thanassis Tiropanis, and Simo Hosio. 2020. CrowdUI: Supporting Web Design with the Crowd. Proceedings of the ACM on Human-Computer Interaction 4, EICS (June 2020), 1–28.
[42]
World Health Organization. 2023. Blindness and vision impairment. https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment
[43]
European Parliament. [n. d.]. European Parliament Accessibility Information. https://www.europarl.europa.eu/portal/en/accessibility
[44]
European Parliament. 2016. DIRECTIVE (EU) 2016/ 2102 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL - of 26 October 2016 - on the accessibility of the websites and mobile applications of public sector bodies., 15 pages. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016L2102
[45]
Parvaneh Parvin, Vanessa Palumbo, Marco Manca, and Fabio Paternò. 2021. The transparency of automatic accessibility evaluation tools. In Proceedings of the 18th International Web for All Conference. ACM, Ljubljana Slovenia, 1–5.
[46]
Elaine Pearson, Christopher Bailey, and Steve Green. 2011. A tool to support the web accessibility evaluation process for novices. In Proceedings of the 16th annual joint conference on Innovation and technology in computer science education. ACM, Darmstadt Germany, 28–32.
[47]
Helen Petrie and Omar Kheir. 2007. The relationship between accessibility and usability of websites. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, San Jose California USA, 397–406.
[48]
Jonathan Robert Pool. 2023. Accessibility Metatesting: Comparing Nine Testing Tools. In 20th International Web for All Conference. ACM, Austin TX USA, 1–4.
[49]
Christopher Power, André Freire, Helen Petrie, and David Swallow. 2012. Guidelines are only half of the story: accessibility problems encountered by blind users on the web. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, Austin Texas USA, 433–442.
[50]
Jose E. Reyes Arias, Kale Kurtzhall, Di Pham, Mohamed Wiem Mkaouer, and Yasmine N. Elglaly. 2022. Accessibility Feedback in Mobile Application Reviews: A Dataset of Reviews and Accessibility Guidelines. In CHI Conference on Human Factors in Computing Systems Extended Abstracts. ACM, New Orleans LA USA, 1–7.
[51]
Germania Rodríguez, Jennifer Pérez, Samanta Cueva, and Rommel Torres. 2017. A framework for improving web accessibility and usability of Open Course Ware sites. Computers & Education 109 (June 2017), 197–215.
[52]
William B. Rouse. 1991. Design for success : a human-centered approach to designing successful products and systems. Wiley. https://cir.nii.ac.jp/crid/1130282270098101120
[53]
Shrirang Sahasrabudhe. 2018. Understanding the interaction strategies of blind health IT users: a qualitative study. Ph. D. Dissertation.
[54]
Hanna Schneider, Katharina Frison, Julie Wagner, and Andras Butz. 2016. CrowdUX: A Case for Using Widespread and Lightweight Tools in the Quest for UX. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems. ACM, Brisbane QLD Australia, 415–426.
[55]
Clay Spinuzzi. 2005. The Methodology of Participatory Design. Technical Communication 52 (05 2005), 163–174.
[56]
Melanie Stade, Farnaz Fotrousi, Norbert Seyff, and Oliver Albrecht. 2017. Feedback Gathering from an Industrial Point of View. In 2017 IEEE 25th International Requirements Engineering Conference (RE). IEEE, Lisbon, Portugal, 71–79.
[57]
Abigale Stangl, Emma Sadjo, Pardis Emami-Naeini, Yang Wang, Danna Gurari, and Leah Findlater. 2023. “Dump it, Destroy it, Send it to Data Heaven”: Blind People’s Expectations for Visual Privacy in Visual Assistance Technologies. In 20th International Web for All Conference. ACM, Austin TX USA, 134–147.
[58]
Vanessa Thomas, Christian Remy, and Oliver Bates. 2017. The Limits of HCD: Reimagining the Anthropocentricity of ISO 9241-210. In Proceedings of the 2017 Workshop on Computing Within Limits. ACM, Santa Barbara California USA, 85–92.
[59]
Markel Vigo, Justin Brown, and Vivienne Conway. 2013. Benchmarking web accessibility evaluation tools: measuring the harm of sole reliance on automated tests. In Proceedings of the 10th International Cross-Disciplinary Conference on Web Accessibility. ACM, Rio de Janeiro Brazil, 1–10.
[60]
World Wide Web Consortium W3C. 2023. Web Content Accessibility Guidelines (WCAG) 2.2. https://www.w3.org/TR/WCAG22/
[61]
WebAIM. 2023. WebAIM: The WebAIM Million - The 2023 report on the accessibility of the top 1,000,000 home pages. https://webaim.org/projects/million/
[62]
Rebecca Wettemann and Trevor White. 2019. THE INTERNET IS UNAVAILABLE. NucleusResearch.com Research Note, Program: Enterprise ApplicationsDocument T103 July 2019 (2019), 5.
[63]
Yeliz Yesilada, Giorgio Brajnik, Markel Vigo, and Simon Harper. 2015. Exploring perceptions of web accessibility: a survey approach. Behaviour & Information Technology 34, 2 (Feb. 2015), 119–134.

      Published In

      W4A '24: Proceedings of the 21st International Web for All Conference
      May 2024
      220 pages
      ISBN:9798400710308
      DOI:10.1145/3677846
      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 22 October 2024

      Author Tags

      1. Accessibility
      2. Feedback
      3. Visual Impairments
      4. Participatory Design
      5. Form

      Qualifiers

      • Research-article

      Conference

      W4A '24
      W4A '24: The 21st International Web for All Conference
      May 13 - 14, 2024
      Singapore, Singapore

      Acceptance Rates

      Overall Acceptance Rate 171 of 371 submissions, 46%
