
Human-Machine Interface

One interesting effect of the advancement of machine learning methods and tools is that researchers are now able to tease out signals from non-invasive sensors on the skin and infer the firing of individual neurones inside the body. In much the same way that features are extracted from photographs for classification, signals from electroencephalograms1, near-infrared spectroscopy2, and electromyography3 can be fed to a domain-aware machine learning model to recover the action potentials4 of individual nerves or muscle fibres.
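To make the pipeline concrete, here is a minimal sketch of the classification step: window a surface-EMG signal, extract a few classic time-domain features, and train an off-the-shelf classifier to tell gestures apart. The synthetic signal generator, the gesture labels, and the three-feature set are placeholder assumptions for illustration; real systems use richer, domain-aware models trained on recorded data.

```python
# Hedged sketch: classify which gesture produced a surface-EMG window.
# make_emg() is a synthetic stand-in for real sensor data (assumption).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_emg(gesture, n_samples=200):
    """Synthetic 200-sample sEMG window; burst amplitude varies by gesture."""
    base = rng.normal(0.0, 0.1, n_samples)
    burst = {0: 0.0, 1: 0.5, 2: 1.0}[gesture]  # hypothetical activation levels
    return base + burst * np.abs(rng.normal(0.0, 0.3, n_samples))

def features(window):
    """Classic time-domain EMG features: mean absolute value,
    waveform length, and zero-crossing count."""
    return [
        np.mean(np.abs(window)),
        np.sum(np.abs(np.diff(window))),
        np.sum(np.diff(np.signbit(window).astype(int)) != 0),
    ]

X, y = [], []
for gesture in (0, 1, 2):  # e.g. rest, pinch, fist (placeholder labels)
    for _ in range(300):
        X.append(features(make_emg(gesture)))
        y.append(gesture)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), random_state=0
)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```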

These signals can in turn be fed into a computer, much like a key-press on a keyboard or the press of a button on a joystick. In effect, this is an emerging method of communicating with and controlling a computer.
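The routing step really is that mundane: once a decoder emits a discrete intent label, delivering it to the computer looks like handling any other input device. The sketch below shows the idea; the intent labels and key names are hypothetical placeholders, not any particular vendor's API.

```python
# Hedged sketch: translate decoded intent labels into synthetic key-presses.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class InputEvent:
    key: str  # mirrors a keyboard event, e.g. "SPACE" (placeholder names)

ACTIONS: Dict[int, InputEvent] = {
    0: InputEvent("NOOP"),   # rest: do nothing
    1: InputEvent("SPACE"),  # pinch: press space
    2: InputEvent("ENTER"),  # fist: press enter
}

def dispatch(intent: int, send: Callable[[InputEvent], None]) -> None:
    """Route a decoded intent label to the host as an input event."""
    event = ACTIONS.get(intent, ACTIONS[0])
    if event.key != "NOOP":
        send(event)

# Example: a real system would pass an OS-level event injector as `send`.
dispatch(1, send=lambda e: print(f"key press: {e.key}"))
```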

There is a raft of brain-machine interface startups, including Elon Musk’s Neuralink5, BrainRobotics6, CTRL-labs7, MindMaze8, Synchron9, Kernel10, EMOTIV11, NeuroPace12, InteraXon13, OpenBCI14, Neurable15, Paradromics16, BrainCo17, Neurolutions18 and Facebook.

There are potential applications in both the clinical and non-clinical domains.

On the clinical side, these interfaces are beginning to allow patients who have lost limbs or muscular function to interact with the world, for instance through robotic arms and hands. Research in this area is not new: Basmajian19 reported successful experiments with an electromyographic interface (an invasive one, that is, with electrodes surgically implanted into muscle fibres) as early as 1963. However, it is only recently that machine learning has made non-invasive techniques viable.

Whether the input signal comes from muscles or directly from the brain, the effect is to control a computer or robot seemingly by thought alone. Imagine being able to control robots remotely with human-like dexterity in dangerous situations such as fighting forest fires.

More interestingly, researchers are thinking beyond the human body plan, that is, towards non-biomimetic control. Why stop at two arms? Could we train our brains to control an octopus-like robot, or the flight surfaces of a flying machine?


  1. Sensors placed on the skin of the head. ↩︎

  2. Sensors on the head that measure changes in blood-oxygen levels. ↩︎

  3. Sensors placed on the skin over the muscles to detect neuromotor activity. ↩︎

  4. Action potentials are biological signals transmitted along nerves and muscles. ↩︎

  5. www.neuralink.com ↩︎

  6. www.brainrobotics.com ↩︎

  7. www.ctrl-labs.com ↩︎

  8. www.mindmaze.com ↩︎

  9. synchron.com ↩︎

  10. kernel.co ↩︎

  11. www.emotiv.com ↩︎

  12. www.neuropace.com ↩︎

  13. choosemuse.com ↩︎

  14. openbci.com ↩︎

  15. neurable.com ↩︎

  16. paradromics.com ↩︎

  17. brainco.tech ↩︎

  18. neurolutions.com ↩︎

  19. Basmajian, J. V. Control and training of individual motor units. Science. 1963 Aug 2;141(3579):440–441. doi:10.1126/science.141.3579.440. PMID: 13969854. ↩︎