R&D Roundup: ‘Twisted light’ lasers, prosthetic vision advances and robot-trained dogs

I see far more research articles than I could possibly write up. This column collects the most interesting of those papers and advances, along with notes on why they may prove important in the world of tech and startups.

In this edition: a new type of laser emitter that uses metamaterials, robot-trained dogs, a breakthrough in neurological research that may advance prosthetic vision and other cutting-edge technology.

Twisted laser-starters

We think of lasers as going “straight” because that’s simpler than understanding their nature as groups of like-minded photons. But lasers have more exotic qualities beyond wavelength and intensity, ones scientists have been trying to exploit for years. One such quality goes by a couple of names: chirality, vorticality, spirality and so on. It describes a beam with a corkscrew motion to it. Applying this quality effectively could improve optical data throughput by an order of magnitude.

The trouble with such “twisted light” is that it’s very difficult to control and detect. Researchers have been making progress on this for a couple of years, but the last couple of weeks brought some new advances.

First, from the University of the Witwatersrand, is a laser emitter that can produce twisted light of record purity and angular momentum — a measure of just how twisted it is. It’s also compact and uses metamaterials — always a plus.

The second is a pair of matched (and very multi-institutional) experiments that yielded both a transmitter that can send vortex lasers and, crucially, a receiver that can detect and classify them. It’s remarkably hard to determine the orbital angular momentum of an incoming photon, and hardware to do so is clumsy. The new detector is chip-scale and together they can use five pre-set vortex modes, potentially increasing the width of a laser-based data channel by a corresponding factor. Vorticality is definitely on the roadmap for next-generation network infrastructure, so you can expect startups in this space soon as universities spin out these projects.
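The arithmetic behind that “corresponding factor” is simple enough to sketch. Assuming each distinguishable vortex mode can carry its own independent data stream (the function name and base rate below are illustrative, not from the papers), aggregate capacity scales linearly with the number of modes the receiver can tell apart:

```python
# Illustrative sketch: treating each orbital angular momentum (OAM) mode
# as an independent parallel channel, total throughput scales with the
# number of modes the chip-scale detector can distinguish.

def oam_channel_capacity(base_rate_gbps: float, num_modes: int) -> float:
    """Aggregate data rate when each distinguishable OAM mode carries
    its own stream at the base rate."""
    return base_rate_gbps * num_modes

# With the five pre-set vortex modes described above, a hypothetical
# 10 Gbps link would scale to 50 Gbps.
print(oam_channel_capacity(10, 5))
```

This is the best case; in practice, crosstalk between modes and imperfect detection would eat into that factor of five.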

Tracing letters on the brain-palm

Startups – TechCrunch


R&D Roundup: Ultrasound/AI medical imaging, assistive exoskeletons and neural weather modeling

In the time of COVID-19, much of what makes its way from the science world to the general public relates to the virus, and understandably so. But other domains, even within medical research, are still active — and as usual, there are tons of interesting (and heartening) stories out there that shouldn’t be lost in the furious activity of coronavirus coverage. This last week brought good news for several medical conditions as well as some innovations that could improve weather reporting and maybe save a few lives in Cambodia.

Ultrasound and AI promise better diagnosis of arrhythmia

Arrhythmia is a relatively common condition in which the heart beats at an abnormal rate, causing a variety of effects, including, potentially, death. It is detected using an electrocardiogram (ECG), and while that technique is sound and widely used, it has its limitations: first, it relies heavily on an expert interpreting the signal, and second, even an expert’s diagnosis doesn’t give a good idea of what the issue looks like in that particular heart. Knowing exactly where the flaw is makes treatment much easier.

Ultrasound is used for internal imaging in lots of ways, but two recent studies establish it as perhaps the next major step in arrhythmia treatment. Researchers at Columbia University used a form of ultrasound monitoring called Electromechanical Wave Imaging to create 3D animations of the patient’s heart as it beat, which helped specialists predict 96% of arrhythmia locations compared with 71% when using the ECG. The two could be used together to provide a more accurate picture of the heart’s condition before treatment begins.

Another approach, from Stanford, applies deep learning techniques to ultrasound imagery and shows that an AI agent can recognize the parts of the heart and estimate its ejection fraction (how efficiently it pumps blood) with accuracy comparable to experts. As with other medical imaging AIs, this isn’t about replacing a doctor but augmenting them; an automated system can help triage and prioritize effectively, suggest things the doctor might have missed or provide an impartial concurrence with their opinion. The code and data set of EchoNet are available for download and inspection.
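The quantity the Stanford model estimates from video is the standard clinical one: ejection fraction, the share of blood the left ventricle pushes out on each beat. The sketch below shows only that formula on hypothetical volumes (the function name and numbers are mine, not from EchoNet, which regresses the value directly from echo footage):

```python
# Ejection fraction (EF): the standard clinical formula,
#   EF = (end-diastolic volume - end-systolic volume) / end-diastolic volume,
# i.e. how much of the filled ventricle's blood is pumped out per beat.

def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Ejection fraction as a percentage, from end-diastolic (EDV)
    and end-systolic (ESV) left-ventricular volumes in mL."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV, EDV > 0")
    return 100 * (edv_ml - esv_ml) / edv_ml

# Hypothetical volumes of 120 mL (filled) and 50 mL (after contraction)
# give an EF of about 58%, inside the commonly cited healthy range.
print(round(ejection_fraction(120, 50), 1))
```

Automating this measurement matters because manual tracing of the ventricle frame by frame is slow and varies between readers.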
