Brain → Handwriting

A computational method allowing the disabled to write and text.

Okezue Bell
6 min read · Oct 12, 2021

This is a technical article. You can approach it with little or no background in animal physiology, neurology/neuroscience, math, AI, neurochemistry, neurobiology, engineering, or brain-computer interfaces. However, if you want to get the most out of the concepts described here, I'd definitely recommend reading up on those topics first.

Recent advancements in the biomedical fields, alongside neuroscience and interface development, have led to a sophisticated area of product development known (among many names) as brain-machine interfaces (BMIs): a class of electronic apparatuses capable of receiving, interpreting, and possibly manipulating the electrical signals of action potentials produced by neuronal activity, either to stimulate physiological changes or to control certain functions on digital devices.

To date, BMI implementations have focused chiefly on two areas: neurological research and aesculapian* applications. BCIs are also being investigated in the robotics space, though that remains a relatively niche use of the technology.

*For you word enthusiasts (me!) out there, aesculapian means “related to medicine”. It's derived from Asclepius, Apollo's son and the Greek god of medicine. As a Greek mythology nerd, this plays into my upcoming reflection on the Odyssey and the Iliad.

Some of the most prominent examples of BMI technology in use come from the company Neuralink, as well as from independent projects developing prosthetics, imaging the brain’s connectome, deciphering the neurological pathways that constitute the gut-brain axis, and automating robotic appendages for precise surgical use.

What follows is a review of a BCI that reconstructs imagined handwriting from neural activity, enabling brain-to-text communication.

Neural control interfaces provide bidirectional communication pathways that allow for brain-to-machine interaction. This channel serves as an electrical bridge based on the signals the brain releases during mental activity, leveraging the sodium-potassium electrochemical gradient that underlies neuronal firing for bio-sensing.

Current non-invasive methods of measuring neural activity have a high margin of error, which presents issues for bootstrapping data and/or using an individual's brain data to inform machine function. Though semi-invasive methods are being developed, the most reliable BMIs today are intracortical.

In Stanford’s High-performance brain-to-text communication via handwriting, a novel method of cognitive restoration via neural interfaces is proposed: a patient's thoughts are converted directly to writing, as opposed to the more conventional approach of trying to restore motor function through an implant so that the person could potentially write again. While the latter would be a viable long-term solution, it's highly unlikely that such therapeutics will be available within the next decade.

While P300 event-related potential keyboard/mouse typing accomplishes a similar goal in terms of communication, it doesn’t exactly replicate user intent or account for more dextrous and nonlinear actions, such as writing in cursive or drawing rather than typing discrete characters.

This new BCI instead collects data from the motor cortex and uses RNN-based neural decoding to reconstruct attempted handwriting for a patient whose hand has been paralyzed by a spinal cord injury (SCI). The participant achieved typing speeds of 90 characters per minute with 94.1% raw accuracy online, and greater than 99% accuracy offline with a general-purpose autocorrect. That is close to the average smartphone typing speed of the participant's age group (about 115 characters per minute)!

So, how did they do it?

First, they began by determining the optimal locus for the BCI. Previous literature already gave the researchers the intuition that the intention for general motor functions is encoded by the motor cortex, hence its name. However, they still needed to investigate whether more sophisticated motor actions like handwriting could still be decoded there after years of paralysis, so they recorded electrical activity from the hand area of the precentral gyrus (Brodmann Area 4 [PG, BA4] neural site):

Figure (via Neuralink): the epsilon-shaped “hand knob” of the precentral gyrus (PG), the region responsible for hand control.

Handwriting was reconstructed from data collected via microelectrode arrays implanted in the hand knob of the precentral gyrus of T5, a tetraplegic participant whose hands are limited to small micro-movements, thereby demonstrating that rapid, dextrous movements can be decoded for paralyzed individuals.

Using PCA, the researchers identified the high-variance neural dimensions and correlated them with the imagined pen’s movement on ruled paper, and with how that movement would construct a letter. Roughly 30% of the neural variance could be explained by a linear decoding of pen-tip velocity [the vector representation of how fast and in which direction the pen moved (and ultimately the displacement of the pen across the paper as well)]. Essentially, the participant was imagining writing on paper.
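To make this concrete, here's a minimal sketch (not the paper's actual pipeline) of the idea: project binned firing rates onto their top principal components, then fit a linear decoder from those components to pen-tip velocity. The array names, shapes, and synthetic data below are my own illustrative assumptions.

```python
# Minimal sketch: PCA on binned firing rates, then a linear decode of
# pen-tip velocity. Shapes and variable names are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical data: T time bins x N electrodes of binned spike counts,
# and the imagined pen-tip velocity (vx, vy) for each bin.
T, N = 5000, 192
firing_rates = rng.poisson(3.0, size=(T, N)).astype(float)
pen_velocity = rng.normal(size=(T, 2))  # stand-in for the cued trajectories

# Project the neural activity onto its top principal components
# (the "high-variance neural dimensions").
pca = PCA(n_components=10)
neural_pcs = pca.fit_transform(firing_rates)

# Linearly decode pen-tip velocity from those components; on real data the
# R^2 here would play the role of "variance explained by a linear decode".
decoder = Ridge(alpha=1.0).fit(neural_pcs, pen_velocity)
print("R^2 on this toy data (~0 by construction):",
      decoder.score(neural_pcs, pen_velocity))
```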

To visualize the high-dimensional data of the top three principal components, the researchers used t-distributed stochastic neighbor embedding (t-SNE) to reduce it to a 2D plot. The data was also time-warped to minimize temporal variability between trials.

b + c. Principal component analysis example for the letters d, e, and m; c shows time-warped neural activity that minimizes fluctuations/error between trials. d. Handwriting rendered for each character from the pen-tip trajectory (orange = start). e. t-SNE embedding. Note that t-SNE plots can be deceiving to interpret: the hyperparameters have a huge effect on the result, and cluster sizes mean little, since t-SNE naturally expands dense clusters and compresses sparse ones. The significance of cluster distance can also be quite elusive; here, some of the clusters aren’t well separated, so their distances retain some meaning.

The t-SNE plot above illustrates the groupings of neural activity for the different characters. Where clusters overlap, those characters are written with similar movements, meaning their motor encodings are similar.
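If you want to poke at this yourself, here's a minimal sketch of the visualization step using scikit-learn's t-SNE on synthetic "trial" vectors (one point per trial, colored by character). The cluster structure, feature counts, and trial counts are made up for illustration; the paper works with the participant's time-warped neural recordings.

```python
# Minimal sketch: t-SNE projection of per-trial neural activity into 2D,
# one point per trial, colored by the character the participant attempted.
# Trial data here is synthetic; the real pipeline time-warps trials first.
import numpy as np
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

n_chars, trials_per_char, n_features = 31, 27, 192
labels = np.repeat(np.arange(n_chars), trials_per_char)

# Synthetic "trial vectors": one cluster centre per character plus noise.
centres = rng.normal(scale=5.0, size=(n_chars, n_features))
trials = centres[labels] + rng.normal(size=(labels.size, n_features))

# Perplexity is a hyperparameter that strongly shapes the plot: as noted
# above, cluster sizes and distances in t-SNE should not be over-interpreted.
embedding = TSNE(n_components=2, perplexity=30,
                 random_state=0).fit_transform(trials)

plt.scatter(embedding[:, 0], embedding[:, 1], c=labels, cmap="tab20", s=8)
plt.title("t-SNE of per-trial neural activity (synthetic data)")
plt.show()
```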

Then, simply by applying nearest-neighbor classification, the researchers achieved 94.1% accuracy (95% confidence interval (CI) = [92.6, 95.8]), showing that residual motor cortex activity for dextrous actions is still a workable BCI signal for long-paralyzed individuals.
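Here's a hedged sketch of what that kind of evaluation looks like: a cross-validated nearest-neighbor classifier plus a percentile bootstrap for the confidence interval. The data below is synthetic, so the numbers it prints are meaningless; the 94.1% figure comes from the participant's actual recordings.

```python
# Minimal sketch: nearest-neighbour classification of characters from
# per-trial neural activity, with a bootstrap confidence interval on
# the accuracy. Everything here is toy data.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)

n_chars, trials_per_char, n_features = 31, 27, 192
labels = np.repeat(np.arange(n_chars), trials_per_char)
centres = rng.normal(scale=4.0, size=(n_chars, n_features))
trials = centres[labels] + rng.normal(size=(labels.size, n_features))

# Cross-validated nearest-neighbour predictions (k=1: "closest trial wins").
preds = cross_val_predict(KNeighborsClassifier(n_neighbors=1),
                          trials, labels, cv=5)
correct = (preds == labels).astype(float)

# Simple percentile bootstrap for a 95% CI on accuracy.
boots = [rng.choice(correct, size=correct.size, replace=True).mean()
         for _ in range(2000)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"accuracy = {correct.mean():.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```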

Using an RNN to string letters together into sentences, the Stanford neuro-researchers were also able to develop a sentence decoding system.

The RNN converts 20-ms bins of neural activity (the original neural population time series xt) into a character-probability time series pt-d, delayed by about one second so the decoder can accumulate evidence before committing to each new character, which results in higher certainty. Thresholding the character probabilities gives what the researchers called the raw online output (the real-time decoding output); an offline output (a retrospective conclusion) could also be computed with a probability-based offline Viterbi search informed by a custom 50,000-word bigram language model. The paper reports the mean character and word error rates (with 95% CIs) for the handwriting BCI across all 5 days.
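To give a feel for the decoding loop, here's a minimal sketch of an RNN that maps 20-ms neural bins to character probabilities, plus a toy thresholding rule for the "raw online output". The architecture, layer sizes, and emission rule are my assumptions, not the paper's exact decoder, and I've left out the delayed training target and the offline Viterbi/bigram language model step.

```python
# Minimal sketch of the decoding idea: an RNN turns 20-ms bins of neural
# features into a per-character probability time series, and a simple
# threshold rule emits a character whenever one probability crosses it.
# Architecture, sizes, and the emission rule are illustrative assumptions.
import torch
import torch.nn as nn

N_FEATURES, N_CHARS, THRESHOLD = 192, 31, 0.7

class CharDecoderRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(N_FEATURES, 256, num_layers=2, batch_first=True)
        self.readout = nn.Linear(256, N_CHARS)

    def forward(self, x):                       # x: (batch, time, features)
        hidden, _ = self.rnn(x)
        return torch.softmax(self.readout(hidden), dim=-1)  # (batch, time, chars)

decoder = CharDecoderRNN()
neural_bins = torch.randn(1, 400, N_FEATURES)   # ~8 s of fake 20-ms bins
probs = decoder(neural_bins)[0]                 # (time, chars)

# "Raw online output": emit a character each time its probability first
# crosses the threshold, then wait for probabilities to fall before re-arming.
# (With untrained weights, nothing will cross 0.7, so the list stays empty.)
emitted, armed = [], True
for p in probs:
    if armed and p.max() > THRESHOLD:
        emitted.append(int(p.argmax()))
        armed = False
    elif p.max() < THRESHOLD / 2:
        armed = True
print("decoded character indices:", emitted)
```

In the actual system, the thresholded stream drives real-time feedback, while the language-model-based Viterbi pass is what pushes offline accuracy above 99%.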

Over the 5 days, it was found that unsupervised retraining of the decoder sustained better performance. It was also discovered that increased temporal variability can make movements easier to decode!

Overall, the prospect of re-enabling written communication for people who are paralyzed is quite exciting and sparks discussion of a more robust implementation of such an expansive technology. Thanks to the researchers (Willett et al.) who designed, executed, and published this study and their findings! I’d recommend reading the paper to get all the details of the BCI they developed.

Before you go…

My name’s Okezue, a developer and researcher obsessed with learning and building things, especially when it involves any biology or computer science. Check out my socials here, or contact me: i@okezuebell.com.

I write something new every day/week, so I hope to see you again soon! Make sure you comment and leave some claps on this too — especially if you liked it! I sure enjoyed writing it! ✌🏾

Twitter | LinkedIn | Website | Newsletter

© 2021 by Okezue Bell. All Rights Reserved.
