Artificial intelligence can now emulate human behaviors – soon it will be dangerously good

phys.org | 3/21/2019 | Staff

When artificial intelligence systems start getting creative, they can create great things – and scary ones. Take, for instance, an AI program that let web users compose music along with a virtual Johann Sebastian Bach: users entered notes, and the system generated Bach-like harmonies to match them.

Run by Google, the app drew great praise for being groundbreaking and fun to play with. It also attracted criticism and raised concerns about AI's dangers.

My study of how emerging technologies affect people's lives has taught me that the problems go beyond the admittedly large concern about whether algorithms can really create music or art in general. Some complaints seemed small, but really weren't, like observations that Google's AI was breaking basic rules of music composition.

In fact, efforts to have computers mimic the behavior of actual people can be confusing and potentially harmful.

Google's program analyzed the notes in 306 of Bach's musical works, finding relationships between the melody and the notes that provided the harmony. Because Bach followed strict rules of composition, the program was effectively learning those rules, so it could apply them when users provided their own notes.
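
The article describes the approach only at a high level. The core idea, though, is simple: tally which harmony notes tend to accompany which melody notes in the training pieces, then reuse those statistics on notes a user supplies. The sketch below is a toy illustration of that idea, not Google's actual system, which is a machine-learning model trained on the 306 works; the note numbers and the harmonize helper are invented for the example.

```python
from collections import Counter, defaultdict

# Toy illustration only: learn how often each harmony note accompanies each
# melody note in a tiny invented "corpus", then harmonize a new melody by
# picking the most frequently seen choice. Google's real system is a
# machine-learning model trained on 306 Bach works; this just sketches the
# idea of learning melody-to-harmony relationships from examples.

# Invented training pairs of (melody pitch, harmony pitch) as MIDI numbers.
training_pairs = [
    (72, 64), (72, 67), (74, 65), (74, 67),
    (76, 67), (76, 72), (77, 69), (79, 71),
]

# Count the harmony choices observed under each melody note.
harmony_counts = defaultdict(Counter)
for melody_note, harmony_note in training_pairs:
    harmony_counts[melody_note][harmony_note] += 1

def harmonize(melody):
    """Pick the most frequently observed harmony note for each melody note."""
    line = []
    for note in melody:
        counts = harmony_counts.get(note)
        # Fall back to doubling the melody an octave below for unseen notes.
        line.append(counts.most_common(1)[0][0] if counts else note - 12)
    return line

user_melody = [72, 74, 76, 79]
print(harmonize(user_melody))  # -> [64, 65, 67, 71] with this toy data
```

A real system has to model far more context than single note pairs, which is why it relies on the kind of complex, hard-to-interpret algorithms described below.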

The Bach app itself is new, but the underlying technology is not. Algorithms trained to recognize patterns and make probabilistic decisions have existed for a long time. Some of these algorithms are so complex that people don't always understand how they make decisions or produce a particular outcome.

The Google Doodle team explains the Bach program.

AI systems are not perfect – many of them rely on data that aren't representative of the whole population, or that are influenced by human biases. It's not entirely clear who might be legally responsible when an AI system makes an error or causes a problem.

Now, though, artificial intelligence technologies are becoming advanced enough to approximate individuals'...
(Excerpt) Read more at: phys.org