SwiftKey Announces Symbol-Based Communication App For Non-Verbal People

11 Dec 2015

SwiftKey, the company behind the popular predictive keyboard app, has announced an experimental application that uses symbols and pictures to help non-verbal people construct sentences, potentially making it easier for millions of people, particularly non-verbal children and their families, to communicate with each other.

SwiftKey today released SwiftKey Symbols, described as a “symbol-based assistive communication app” aimed primarily at non-verbal individuals who have difficulty communicating verbally. Earlier this year a group of SwiftKey employees, some with experience of autism in their families, came up with the idea of building an assistive app powered by SwiftKey’s core contextual language prediction technology. ‘We realized that SwiftKey’s core prediction and personalization technology – which learns from each individual as they use it – would be a natural fit for people on the autistic spectrum who respond particularly well to routine-based activity,’ the company wrote on its blog. As its name suggests, the app uses symbols to help kids communicate, building sentences from images with the help of the company’s core personalization and prediction tech.

In fact, SwiftKey’s original Android keyboard was, at the time of its launch, arguably the smartest virtual keyboard around for word prediction and customization. The same approach carries over here: the system learns about the user and quickly surfaces the symbols the child is likely to want next, making it, according to the company, faster than other similar assistive apps on the market.
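
SwiftKey has not published how this prediction works, but the general idea of learning which symbols a particular child tends to use together can be sketched in a few lines of Kotlin. Everything below is illustrative: the class and symbol names are hypothetical, and the real engine is far more sophisticated.

    // Hypothetical sketch: count which symbol tends to follow which,
    // then suggest the most frequent successors. Not SwiftKey's code.
    class SymbolPredictor {
        // counts["want"]["juice"] = how often "juice" followed "want"
        private val counts = mutableMapOf<String, MutableMap<String, Int>>()

        fun learn(sentence: List<String>) {
            sentence.zipWithNext { prev, next ->
                val successors = counts.getOrPut(prev) { mutableMapOf() }
                successors[next] = (successors[next] ?: 0) + 1
            }
        }

        fun suggest(lastSymbol: String, topN: Int = 3): List<String> =
            counts[lastSymbol]
                ?.entries
                ?.sortedByDescending { it.value }
                ?.take(topN)
                ?.map { it.key }
                ?: emptyList()
    }

Each completed sentence would update the model through learn(), so the suggestions gradually come to reflect that particular child’s routines.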

The idea is to provide a free, accessible app for individuals with speech and learning difficulties, making it easier for them to communicate with friends and family. The team has combined its predictive language keyboard technology and machine-learning capabilities with a range of hand-drawn everyday symbols to ensure speedy access to the right image at the right time when forming sentences.

SwiftKey Symbols is free and comes out of SwiftKey’s experimental Greenhouse program, which has also produced SwiftKey Neural Alpha, Clarity Keyboard, and Hexy Launcher. Users build a sentence from images, hand-drawn by a SwiftKey team member, chosen either from a set of categories or from a prediction bar powered by SwiftKey’s engine. The app also takes into account the time of day and any relevant timetable commitments, so categories of images that are irrelevant at certain times are pushed aside. There are dozens of images, each listed in an appropriate category such as people, colors, or toys, and the smart suggestion bar helps users build out their thought.
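
The blog post doesn’t say how the timetable filtering is implemented. One plausible way to picture it, using made-up category names and a made-up schedule format, is to rank categories by whether the current day and time fall inside a scheduled activity. This is a sketch of the idea only, not SwiftKey’s code, and it uses java.time for brevity rather than a KitKat-era date/time API.

    import java.time.DayOfWeek
    import java.time.LocalTime

    // Illustrative schedule entry: an activity window and the symbol
    // categories worth boosting while it is underway.
    data class ScheduleEntry(
        val day: DayOfWeek,
        val start: LocalTime,
        val end: LocalTime,
        val boostedCategories: Set<String>  // e.g. "art", "school"
    )

    fun relevantCategories(
        allCategories: List<String>,
        schedule: List<ScheduleEntry>,
        day: DayOfWeek,
        now: LocalTime
    ): List<String> {
        val boosted = schedule
            .filter { it.day == day && now >= it.start && now <= it.end }
            .flatMap { it.boostedCategories }
            .toSet()
        // Boosted categories first; everything else stays reachable below.
        return allCategories.sortedByDescending { it in boosted }
    }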

Assistive communication tools, such as visual boards, aim to help with that, but they are often slow and cumbersome and don’t always get the point across adequately. In SwiftKey Symbols, users can add their own categories and personal images to make the choices more relevant to their lives, and the app also offers a text-to-speech feature so the sentence that has been created can be played out loud. The prediction engine can suggest the next word, expression, or symbol based on the images that have already been chosen. The team wanted to build a free app that people with learning and speech difficulties could use to communicate better with their carers and with others who have similar difficulties.
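
On Android, reading a finished sentence aloud can be done with the platform’s standard TextToSpeech API. The sketch below is a hedged illustration of such a feature rather than SwiftKey’s actual implementation; the class name and the way symbols are joined into a sentence are assumptions.

    import android.content.Context
    import android.speech.tts.TextToSpeech
    import java.util.Locale

    // Illustrative only: speak a sentence built from symbols. The
    // four-argument speak() shown here is API 21+; on KitKat the older
    // three-argument overload would be used instead.
    class SentenceSpeaker(context: Context) : TextToSpeech.OnInitListener {
        private val tts = TextToSpeech(context, this)
        private var ready = false

        override fun onInit(status: Int) {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US)
                ready = true
            }
        }

        // Symbols such as ["I", "want", "juice"] become "I want juice".
        fun speakSentence(symbols: List<String>) {
            if (!ready) return
            val sentence = symbols.joinToString(" ")
            tts.speak(sentence, TextToSpeech.QUEUE_FLUSH, null, "sentence")
        }

        fun shutdown() = tts.shutdown()
    }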

SwiftKey’s technology is already used by Professor Stephen Hawking and by Israeli startup Click2Speak to address the needs of people with mobility issues. In SwiftKey Symbols, if a child has an art class on Wednesday mornings and regularly communicates about it at that time, the app can use that knowledge to suggest the symbols the child is most likely to need. Earlier this month, the National Autistic Society (NAS) released a film to help people understand how the senses of some people with autism can be more sensitive. The beta’s current interface lets kids form sentences using symbols for things like activities, weather, objects, and places.

Although other apps make it easy to define favorites, only SwiftKey Symbols attempts to simplify finding the right symbols through machine-learning prediction. It works on any device running Android 4.4 KitKat or later, which covers most devices at this point, so few users should be locked out by their software version. What’s more, users can create their own cards by adding images and categories, and audio playback via the text-to-speech feature reads a finished sentence out loud for the user. Autism itself can involve difficulty understanding and being aware of other people’s emotions and feelings, and/or problems taking part in, or starting, conversations.

Restricted and repetitive patterns of thought or physical movement, such as hand tapping or twisting, are another key area, along with becoming upset if these set routines are disrupted.
