IBM Teams Up With MIT To Help AI See And Hear Like Humans


hunter
Posts: 717
Joined: Thu Jun 11, 2015 11:50 am

IBM Teams Up With MIT To Help AI See And Hear Like Humans

Post by hunter » Tue Nov 15, 2016 10:51 am

IBM and MIT have begun a “multi-year” partnership that aims to improve AI’s ability to interpret sight and sound as well as humans do. IBM will supply the expertise and technology from its Watson cognitive computing platform, while MIT will conduct the research. It’s still very early, but the two already have a sense of what they can accomplish.

The new lab will be led by Jim DiCarlo, head of MIT’s Department of Brain and Cognitive Sciences. That department and CSAIL will both contribute members to the new lab, as will IBM’s Watson team. The goal is a close and, hopefully, fruitful collaboration between the two groups.

One of the biggest challenges will be to advance pattern recognition and prediction. A human can easily describe what they saw happen in an event and predict what happens next, IBM says, but that’s virtually “impossible” for current AI. That ability to quickly summarize and foresee events could be useful for everything from helping health care workers look after the elderly to repairing complicated machines, among other examples.
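To get a feel for what “predicting what happens next” means computationally, here is a minimal toy sketch in Python. It is not IBM or MIT’s approach, and it skips the genuinely hard part (understanding raw video and audio): it assumes events have already been labeled, and simply learns which event tends to follow which from a handful of made-up example sequences.

from collections import Counter, defaultdict

# Toy illustration only: a first-order model over already-labeled events.
# Real systems like the ones IBM and MIT are pursuing would have to
# produce these labels from raw sight and sound in the first place.

def train_transitions(sequences):
    """Count how often each event is followed by each other event."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, current_event):
    """Return the most frequently observed follow-up event, if any."""
    followers = transitions.get(current_event)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Hypothetical labeled event sequences, made up for the example.
observed = [
    ["person_stands", "person_walks", "person_opens_door", "person_exits"],
    ["person_stands", "person_walks", "person_sits"],
    ["person_walks", "person_opens_door", "person_exits"],
]

model = train_transitions(observed)
print(predict_next(model, "person_opens_door"))  # -> "person_exits"

Even this toy version shows why the problem is hard: the interesting work is not counting transitions, it is getting from pixels and audio to reliable event descriptions at all.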

There’s no guarantee that IBM and MIT will crack a problem that has daunted Google, Facebook and countless academics. However, it’s rare for scientists to get access to this kind of technology, so we might just see breakthroughs that aren’t feasible for teams with only limited access to AI-friendly hardware and code.

The MIT partnership is one of several IBM has established lately; the company’s VP of Cognitive Computing, Guru Banavar, details the rest in a blog post.

The collaborations cover AI for decision making, cybersecurity, deep learning for language, and more. IBM is also making a substantial investment in the foundations of AI work.

“We are in the process of building a system of best practices that can help guide the safe and ethical management of AI systems,” wrote Banavar, “including alignment with social norms and values.”

The new IBM-MIT Laboratory for Brain-inspired Multimedia Machine Comprehension — we’ll just call it BM3C — is a multi-year collaboration between the two organizations that will be looking specifically at the problem of computer vision and audition.

The problem of computer vision spans multiple disciplines, so it has to be attacked from multiple directions.

