Over the last few months we have been concentrating on two related projects: automated web accessibility checks, and the automatic linking and categorisation of open-licence, freely available Augmentative and Alternative Communication (AAC) symbol sets for those with complex communication needs.
As previously mentioned, we presented these projects at a workshop at the Alan Turing Institute in November, and work has been ongoing. We hope to share the results by the end of March 2020.
Automating Web Accessibility Checks
Recent regulations and UK laws recognise the W3C Web Content Accessibility Guidelines (WCAG) as a method of ensuring compliance, but testing can be laborious, and checkers that automate the process need to catch more of the errors that occur. This has led to the development of an accessibility checker that carries out well-known automated checks but also uses image recognition to judge whether the alternative text for images is appropriate. A second AI-related check addresses the WCAG 2.1 Success Criterion 2.4.4 Link Purpose (In Context), which requires that “the purpose of each link can be determined from the link text alone or from the link text together with its programmatically determined link context, except where the purpose of the link would be ambiguous to users in general”.
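The first of these checks, comparing image-recognition output with an image’s alternative text, might be sketched as follows. This is a minimal illustration only: the helper names, labels and token-overlap rule below are all hypothetical, not the checker’s actual implementation.

```python
def predict_labels(image_path):
    # Placeholder for an image-recognition model (e.g. a pretrained CNN).
    # Returns hypothetical candidate labels for demonstration purposes.
    return ["dog", "grass", "outdoor"]

def alt_text_plausible(alt_text, labels, threshold=1):
    # Flag the alt text as plausible if at least `threshold` of its words
    # overlap with the model's predicted labels for the image.
    tokens = {t.strip(".,").lower() for t in alt_text.split()}
    return len(tokens.intersection(labels)) >= threshold

labels = predict_labels("photo.jpg")
print(alt_text_plausible("A dog running on grass", labels))  # True
print(alt_text_plausible("Company logo", labels))            # False
```

In practice a match score like this would only flag candidates for human review, since alt text can be appropriate without naming the objects a classifier detects.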
A Natural Language Processing (NLP) model is used to check whether the text in the aria-label attribute of a hyperlink matches the content at the target URL. Based on the matching result, it is possible to determine whether the target web page or website fits the link purpose criterion. Despite previous research in this area, the task is proving challenging, and two different experiments are under way: one has been designed to use existing NLP models (e.g. GloVe), while the other is investigating training on data with human input. The results will be published in an academic paper and presented at a conference.
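The GloVe-based experiment can be illustrated with a toy version of the matching step: embed the aria-label and the page text as averaged word vectors, then compare them with cosine similarity. The three-dimensional vectors and the threshold below are purely illustrative stand-ins for real pretrained embeddings, not the project’s actual model.

```python
import math

# Toy stand-ins for pretrained GloVe vectors (real ones have 50-300 dims).
VECTORS = {
    "annual": [0.9, 0.1, 0.0],
    "report": [0.8, 0.2, 0.1],
    "2019":   [0.7, 0.3, 0.0],
    "cart":   [0.0, 0.9, 0.8],
}

def embed(text):
    """Average the word vectors of the known words in `text`."""
    vecs = [VECTORS[w] for w in text.lower().split() if w in VECTORS]
    if not vecs:
        return None
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def link_purpose_match(aria_label, page_text, threshold=0.8):
    """Return True if the aria-label plausibly describes the target page."""
    a, b = embed(aria_label), embed(page_text)
    if a is None or b is None:
        return False  # nothing to compare; flag for manual review
    return cosine(a, b) >= threshold

print(link_purpose_match("Annual report", "2019 annual report"))  # True
print(link_purpose_match("Cart", "2019 annual report"))           # False
```

Averaging word vectors loses word order, which is one reason the task remains challenging and a second, human-informed training experiment is being explored.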
AAC Symbol Classification to Aid Searches
The team have also investigated issues for those supporting Augmentative and Alternative Communication (AAC) users, who may have severe communication difficulties and make use of symbols and pictures on speech-generating devices. A multilingual symbol repository has been created for families, carers and professionals, linking different freely available symbol sets. The symbol sets can be used to create communication charts for the AAC user, but this takes time, and finding culturally appropriate symbols is not always easy. A system has been developed that automatically links and categorises symbols across symbol sets by part of speech, topic and language, using a combination of linked data, natural language processing and image recognition. The last of these is not always successful in isolation, as symbols lack context and concepts are not necessarily concrete (such as an image for ‘anxious’), so further work is required to enhance the system. The Global Symbols AAC symbol repository will be making use of these features in its BoardBuilder for making symbol charts by the end of March 2020.
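The linking step can be pictured as grouping symbols from different sets under a shared concept identifier. The ConceptNet-style identifiers and the symbol records below are made up for demonstration; the repository’s real data model is not described here.

```python
from collections import defaultdict

# Hypothetical symbol records from different open-licence symbol sets,
# each already tagged with a shared concept ID and part of speech.
symbols = [
    {"set": "SetA", "label": "happy",   "concept": "/c/en/happy", "pos": "ADJ"},
    {"set": "SetB", "label": "content", "concept": "/c/en/happy", "pos": "ADJ"},
    {"set": "SetC", "label": "dog",     "concept": "/c/en/dog",   "pos": "NOUN"},
]

# Group symbols by concept so a search for one concept returns
# equivalent symbols from every linked set.
linked = defaultdict(list)
for s in symbols:
    linked[s["concept"]].append((s["set"], s["label"]))

print(linked["/c/en/happy"])  # [('SetA', 'happy'), ('SetB', 'content')]
```

Grouping by concept rather than by label is what lets a search surface culturally varied symbols that carry the same meaning.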
This project is exploring some existing Convolutional Neural Network (CNN, or ConvNet) models to help classify, categorise and integrate AAC symbols. Experiments have already been undertaken to produce a baseline by simply using image matrix similarity. Due to the nature of AAC symbols, some visually similar symbols represent different concepts, while some visually different symbols represent the same concept across different symbol sets. The training data set maps symbol image labels, and NLP models have been used to map those labels to the same concept across the different sets. This will help those supporting AAC users to offer a much wider choice of symbols suitable for different cultures and languages. The Global Symbols API for searching open-licence and freely available AAC symbols is already being used in the Cboard application for AAC users.
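The image-matrix similarity baseline mentioned above can be sketched by comparing two symbols directly as pixel matrices. In practice the images would be loaded and resized to a common shape first; the 3×3 toy matrices here stand in for real symbol images.

```python
import math

def flatten(matrix):
    # Turn a 2D pixel matrix into a flat vector for comparison.
    return [px for row in matrix for px in row]

def matrix_similarity(img_a, img_b):
    # Cosine similarity between the two flattened pixel matrices.
    a, b = flatten(img_a), flatten(img_b)
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

symbol_a = [[0, 255, 0], [255, 255, 255], [0, 255, 0]]  # "plus" shape
symbol_b = [[0, 255, 0], [255, 255, 255], [0, 255, 0]]  # identical shape
symbol_c = [[255, 0, 255], [0, 0, 0], [255, 0, 255]]    # inverse shape

print(matrix_similarity(symbol_a, symbol_b))
print(matrix_similarity(symbol_a, symbol_c))  # 0.0
```

This kind of pixel-level baseline is exactly where the concept mismatch described above bites: two near-identical matrices can encode different meanings, which is why label-to-concept mapping with NLP is needed on top.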