MIT’s New Language for Big “Sparse” Data, Uber’s Self-Driving Debut, and More – This Week in Artificial Intelligence 09-16-16

1 – 10 Later-stage Startups Join the Seattle Machine Learning and Data Science Accelerator

Microsoft has launched its fourth Machine Learning Accelerator with 10 late-stage startups in Seattle. The chosen startups, spanning domains from manufacturing to social media, have received an average of $5.3 million in funding and have $3 million in average annual recurring revenue. They will gain access to a number of workshops and coaches, as well as business and customer connections through partnerships with the Sprosty Network and its RetailXelerator program. The startups selected are: Cycle Computing; DataRPM; Daometry; KenSci; LoginRadius; Metric Insights; Pymetrics; Shareablee; Vision Insights; and Versium.

(The full press release on the Microsoft Accelerator blog had been taken down as of this post’s September 2017 update)

2 – Faster Parallel Computing

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have created a new programming language called Milk that handles big-data programs more efficiently. In initial tests on several well-known algorithms, programs written with Milk ran four times as fast as those written in more traditional languages. Big data sets often consist of sparse data, for which today’s memory chips are not optimized. Instead of fetching a single piece of data, a processor typically grabs an entire “block” of adjacent data, following a principle known as locality: if one item is needed, its neighbors probably will be too. That assumption breaks down when, say, an algorithm needs only 20 specific books out of an online retailer’s database of 2 million. Milk instead lets a program defer and batch its memory requests so that it fetches only the data items it actually needs. The research team presented the new language this week at the International Conference on Parallel Architectures and Compilation Techniques.
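To make the locality problem concrete, here is a minimal sketch (in plain Python, not Milk; the variable names and numbers are illustrative, not from the article) contrasting the two access patterns. A sequential scan uses every byte of each cache block the hardware fetches, while a scattered gather touches one item per block and wastes the rest of each fetch:

```python
# Illustrative only: the access patterns, not actual cache behavior.
catalog = list(range(2_000_000))       # stand-in for a 2M-item database
wanted = [15, 90_210, 1_499_999, 7]    # a few scattered indices

# Dense scan: sequential access, so every block the processor
# fetches is fully used before moving on.
dense_total = sum(catalog)

# Sparse gather: each lookup likely lands in a different memory
# block, so most of each fetched block goes unused -- this is the
# pattern Milk is designed to batch and reorder.
sparse_items = [catalog[i] for i in wanted]
```

The point is not the Python itself but the shape of the second loop: indices arrive in no particular order, so ordinary block-at-a-time prefetching does mostly wasted work.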

(Read the full article on MIT News)

3 – Pittsburgh, Your Self-Driving Uber is Arriving Now

On Wednesday, Uber sent its first batch of self-driving cars out into Pittsburgh (though not without “safety” drivers in case something went wrong). In its press release, Uber cited its recent acquisition of Otto, which builds hardware and software for self-driving freight trucks, as fortifying its self-driving engineering group into one of the strongest in the world. One of Engadget’s writers published an article yesterday detailing his experience riding in one of Uber’s self-driving cars, noting that while the car handled most of the 30-minute ride autonomously, some situations gave it more difficulty (such as dealing with human error at four-way intersections). There are no announcements yet on the length of the trial or where self-driving Ubers may roll out next.

(Read the full article on Uber Newsroom)

4 – Brain-Sensing Technology Allows Typing at 12 Words Per Minute

Stanford scientists have developed technology that directly reads brain signals and drives a cursor over a keyboard at a rate of 12 words per minute – with monkeys. While slower versions of this technology exist, the new improvements apply to the underlying algorithms that translate brain signals into typed keyboard letters. Using monkeys trained to transcribe text on a computer screen (texts included articles from The New York Times and sections of Hamlet), the latest tests showed significant improvements in both speed (up to three times faster) and accuracy. Krishna Shenoy, a professor of electrical engineering at Stanford, and postdoctoral fellow Paul Nuyujukian, who designed the technology, are part of the Brain-Machine Interface initiative of the Stanford Neurosciences Institute. The hope is that this new interface, which includes use of a multi-electrode array, can eventually be tested and used with people who have paralysis.

(Read the full article on Stanford News)

5 – NVIDIA Unveils Palm-Sized, Energy-Efficient AI Computer for Self-Driving Cars

NVIDIA unveiled its latest AI-based technology, a palm-sized computer meant to be installed in autonomous vehicles for navigation and other automated driving systems. The NVIDIA® DRIVE™ PX 2 AI computing platform builds on NVIDIA’s DRIVE PX processor, currently used by 80 automakers, tier 1 suppliers, researchers, and startups. The new technology will serve as the AI engine for Baidu’s self-driving car; a partnership between the two companies was announced last week at Baidu World in Beijing. The DRIVE PX 2 will be available to other production partners in the fourth quarter of 2016.

(Read the full article at NVIDIA News)

Image credit: MIT Research News