Simplifying a text’s grammar and structure is a skill most of us acquire in school, but AI systems typically have a tougher go of it, owing to their lack of explicit linguistic knowledge.

That said, scientists at Facebook AI Research and Inria are progressing toward a simplification model dubbed ACCESS (AudienCe-CEntric Sentence Simplification), which they claim enables customization of text length, amount of paraphrasing, lexical complexity, syntactic complexity, and other parameters while preserving coherence.

“Text simplification can be beneficial for people with cognitive disabilities, such as aphasia, dyslexia, and autism, but also for second language learners and people with low literacy,” wrote the researchers in a preprint paper detailing their work.

“The type of simplification needed for each of these audiences is different … Yet, research in text simplification has been mostly focused on developing models that generate a single generic simplification for a given source text with no possibility to adapt outputs for the needs of various target populations.

[We] propose a controllable simplification model that provides explicit ways for users to manipulate and update simplified outputs as they see fit.”

To this end, the team tapped seq2seq, a general-purpose sequence-to-sequence encoder-decoder framework that maps an input sequence (here, the source sentence plus conditioning information) to an output sequence (the simplified sentence).
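One common way to make a seq2seq model controllable in this audience-centric spirit is to encode the desired output attributes (e.g. target length ratio, degree of paraphrasing) as special tokens prepended to the source sentence, so the model learns to condition its output on them. The sketch below illustrates that idea; the token names, attribute labels, and ratio values are illustrative assumptions, not the paper’s exact vocabulary.

```python
# Hypothetical sketch of control-token conditioning for a seq2seq
# simplification model: each requested attribute becomes a special
# <NAME_value> token prepended to the source sentence. The attribute
# names ("NbChars", "LevSim") and values here are assumptions for
# illustration only.

def add_control_tokens(source: str, controls: dict) -> str:
    """Prepend one <NAME_value> token per requested control attribute,
    in sorted order so the prefix is deterministic."""
    prefix = " ".join(
        f"<{name}_{value:.2f}>" for name, value in sorted(controls.items())
    )
    return f"{prefix} {source}"

conditioned = add_control_tokens(
    "The cat perched imperiously on the mat.",
    {"NbChars": 0.8, "LevSim": 0.6},  # shorter output, moderate paraphrasing
)
print(conditioned)
```

At training time, such control tokens would be computed from each source/simplification pair; at inference time, a user sets them explicitly to tune the output for a given audience.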
