{"id":35172,"date":"2017-06-16T08:39:00","date_gmt":"2017-06-16T08:39:00","guid":{"rendered":"https:\/\/new.igihe.com\/robot-uses-deep-learning-and-big-data-to-write\/"},"modified":"2017-06-16T08:40:06","modified_gmt":"2017-06-16T08:40:06","slug":"robot-uses-deep-learning-and-big-data-to-write","status":"publish","type":"post","link":"https:\/\/new.igihe.com\/english\/robot-uses-deep-learning-and-big-data-to-write\/","title":{"rendered":"Robot uses deep learning and big data to write and play its own music"},"content":{"rendered":"<p>{A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in a lab at the Georgia Institute of Technology. The pieces are generated using artificial intelligence and deep learning.}<\/p>\n<p>Researchers fed the robot nearly 5,000 complete songs &#8212; from Beethoven to the Beatles to Lady Gaga to Miles Davis &#8212; and more than 2 million motifs, riffs and licks of music. Aside from giving the machine a seed, or the first four measures to use as a starting point, no humans are involved in either the composition or the performance of the music.<\/p>\n<p>The first two compositions are roughly 30 seconds in length. The robot, named Shimon, can be seen and heard playing them at https:\/\/www.youtube.com\/watch?v=j82nYLOnKtM and https:\/\/www.youtube.com\/watch?v=6MSk5PP9KUA.<\/p>\n<p>Ph.D. student Mason Bretan is the man behind the machine. He&#8217;s worked with Shimon for seven years, enabling it to &#8220;listen&#8221; to music played by humans and improvise over pre-composed chord progressions. Now Shimon is a solo composer for the first time, generating the melody and harmonic structure on its own.<\/p>\n<p>&#8220;Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,&#8221; said Bretan, who will receive his doctorate in music technology this summer at Georgia Tech. 
&#8220;Shimon&#8217;s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments.&#8221;<\/p>\n<p>Bretan says this is the first time a robot has used deep learning to create music. And unlike its days of improvising, when it played monophonically, Shimon is now able to play harmonies and chords. It&#8217;s also thinking much more like a human musician, focusing less on the next note, as it did before, and more on the overall structure of the composition.<\/p>\n<p>&#8220;When we play or listen to music, we don&#8217;t think about the next note and only that next note,&#8221; said Bretan. &#8220;An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole.&#8221;<\/p>\n<p>Shimon was created by Bretan&#8217;s advisor, Gil Weinberg, director of Georgia Tech&#8217;s Center for Music Technology.<\/p>\n<p>&#8220;This is a leap in Shimon&#8217;s musical quality because it&#8217;s using deep learning to create a more structured and coherent composition,&#8221; said Weinberg, a professor in the School of Music. &#8220;We want to explore whether robots could become musically creative and generate new music that we humans could find beautiful, inspiring and strange.&#8221;<\/p>\n<p>Shimon will create more pieces in the future. As long as the researchers feed it a different seed, the robot will produce something different each time &#8212; music that the researchers can&#8217;t predict. For the first piece, Bretan fed Shimon a melody composed of eighth notes. The second time, it received a sixteenth-note melody, which influenced it to generate faster note sequences.<\/p>\n<p>Bretan acknowledges that he can&#8217;t pick out individual songs that Shimon is referencing. 
He can, however, recognize classical chord progressions and the influence of artists such as Mozart. &#8220;They sound like a fusion of jazz and classical,&#8221; said Bretan, who plays the keyboards and guitar in his free time. &#8220;I definitely hear more classical, especially in the harmony. But then I hear chromatic moving steps in the first piece &#8212; that&#8217;s definitely something you hear in jazz.&#8221;<\/p>\n<p>Shimon&#8217;s debut as a solo composer was featured in a video clip at the Consumer Electronics Show (CES) keynote and will have its first live performance at the Aspen Ideas Festival at the end of June. It&#8217;s the latest project within Weinberg&#8217;s lab. He and his students have also created a robotic prosthesis for a drummer, a robotic third arm for all drummers, and an interactive robotic companion that plays music from a phone and dances to the beat.<\/p>\n<figure class=\"spip-document spip-document-20822 aligncenter\"><img decoding=\"async\" src=\"https:\/\/en-images.igihe.com\/jpg\/170614120407_1_900x600.jpg\" alt=\"Shimon, a robot in the Center for Music Technology and School of Music.\" \/><\/figure>\n<p>Source: Science Daily<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in a lab at the Georgia Institute of Technology. 
The pieces are generated using artificial intelligence and deep learning. Researchers fed the robot nearly 5,000 complete songs &#8212; from Beethoven to the Beatles to Lady Gaga to Miles Davis [&hellip;]<\/p>\n","protected":false},"author":8,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[75],"byline":[2491],"hashtag":[],"class_list":["post-35172","post","type-post","status-publish","format-standard","hentry","category-science-technology","tag-homenews","byline-science-daily"],"bylines":[{"id":2491,"name":"SCIENCE DAILY","slug":"science-daily","description":"","image":{"id":0,"url":"https:\/\/secure.gravatar.com\/avatar\/?s=96&d=mm&f=y&r=g","alt":"Default avatar","title":"Default avatar","caption":"","mime_type":"image\/jpeg","sizes":[]},"user_id":null}],"contributors":[{"id":2491,"name":"SCIENCE DAILY","slug":"science-daily","description":"","image":{"id":0,"url":"https:\/\/secure.gravatar.com\/avatar\/?s=96&d=mm&f=y&r=g","alt":"Default avatar","title":"Default 
avatar","caption":"","mime_type":"image\/jpeg","sizes":[]},"user_id":null}],"featured_image":null,"_links":{"self":[{"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/posts\/35172","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/users\/8"}],"replies":[{"embeddable":true,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/comments?post=35172"}],"version-history":[{"count":0,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/posts\/35172\/revisions"}],"wp:attachment":[{"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/media?parent=35172"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/categories?post=35172"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/tags?post=35172"},{"taxonomy":"byline","embeddable":true,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/byline?post=35172"},{"taxonomy":"hashtag","embeddable":true,"href":"https:\/\/new.igihe.com\/english\/wp-json\/wp\/v2\/hashtag?post=35172"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}