Last month I covered an incredible AI use case: talking to the dead using the digital footprint they leave behind. By analyzing texts, posts, photos, and audio and video messages, we can train a bot to mimic a person's behavior and bring the departed back into a digital existence. It may sound creepy and may raise ethical questions about the posthumous use of technology, but those left behind may find that these bots ease their pain.
An algorithm wrote an entire movie script by itself
Oscar Sharp, a BAFTA-nominated filmmaker and Screen International Star of Tomorrow, and Ross Goodwin, an NYU AI researcher, collaborated on an AI bot that writes movie scripts. They originally named the algorithm Jetson, but the bot, which later came to call itself Benjamin, went on to write the entire script of the movie Sunspring. The screenplay, stage directions, and dialogue were generated by Benjamin, the world's first automated screenwriter. It was trained to write science fiction screenplays by feeding it a pile of science fiction screenplays from the 1980s and '90s. Sharp later turned this experimental script into a short film, which Ars Technica broadcast online, calling it a tale of romance and murder set in a dark future world. Benjamin, an LSTM RNN (long short-term memory recurrent neural network), also composed a pop song for the movie after learning from 30,000 other pop songs.
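To give a sense of how a screenplay generator like Benjamin works under the hood, here is a minimal character-level LSTM sketch in Keras. It is purely illustrative: the corpus file name, model size, and training settings are assumptions of mine, not details of Goodwin's actual system.

```python
# Minimal character-level LSTM text generator (illustrative sketch only --
# not Benjamin's actual architecture or training setup).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

text = open("scifi_screenplays.txt").read().lower()   # assumed corpus file
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Slice the corpus into fixed-length windows, each predicting the next character
seq_len, step = 40, 3
sequences, next_chars = [], []
for i in range(0, len(text) - seq_len, step):
    sequences.append(text[i:i + seq_len])
    next_chars.append(text[i + seq_len])

x = np.zeros((len(sequences), seq_len, len(chars)), dtype=bool)
y = np.zeros((len(sequences), len(chars)), dtype=bool)
for i, seq in enumerate(sequences):
    for t, c in enumerate(seq):
        x[i, t, char_to_idx[c]] = 1
    y[i, char_to_idx[next_chars[i]]] = 1

model = keras.Sequential([
    layers.Input(shape=(seq_len, len(chars))),
    layers.LSTM(128),                          # the "long short-term memory" layer
    layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(x, y, batch_size=128, epochs=20)

def generate(seed, length=400):
    """Generate new text one character at a time by sampling the softmax output."""
    out = seed
    for _ in range(length):
        x_pred = np.zeros((1, seq_len, len(chars)))
        for t, c in enumerate(out[-seq_len:]):
            x_pred[0, t, char_to_idx[c]] = 1
        probs = model.predict(x_pred, verbose=0)[0]
        probs = probs / probs.sum()            # renormalize for sampling
        out += chars[np.random.choice(len(chars), p=probs)]
    return out
```

Train something like this on a few megabytes of screenplays and it will produce text in the surface style of its training data, complete with stage directions, which is roughly the effect Sunspring's script captures.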
IBM Watson creates the first movie trailer using AI
20th Century Fox collaborated with IBM Watson scientists to cut a trailer for their thriller Morgan in August 2016. They trained Watson on 100 horror movie trailers, and Watson performed visual, audio, and composition analysis to understand how scenes are put together to make a trailer effective and, more importantly, what people find scary when a scene is paired with the right background score. Once the algorithm had been trained on the existing trailers, they fed it the entire Morgan movie to find the right scenes to include in the final trailer. It picked the scenes almost instantly. Because this was an experiment, a human editor still assembled and polished the final cut, but even with that involvement the trailer was finished in less than 24 hours, whereas trailers typically take 10 to 30 days to cut. Impressed with the outcome, 20th Century Fox used it as the theatrical trailer.
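Conceptually, the last step is a scoring-and-selection problem: rank every scene by how much tension its visuals, audio, and composition convey, then fill a trailer-length budget with the best-scoring ones. Here is a toy sketch of that idea; the Scene fields, weights, and budget are assumptions for illustration, not IBM Watson's actual pipeline.

```python
# Toy scene-selection step (illustrative; not Watson's real trailer pipeline).
from dataclasses import dataclass

@dataclass
class Scene:
    start: float        # seconds into the film
    end: float
    visual: float       # assumed 0-1 scores from separate visual,
    audio: float        # audio, and composition analyzers
    composition: float

def tension(scene, weights=(0.5, 0.3, 0.2)):
    """Combine analyzer scores into a single tension score (weights are assumed)."""
    wv, wa, wc = weights
    return wv * scene.visual + wa * scene.audio + wc * scene.composition

def pick_trailer_scenes(scenes, max_seconds=90):
    """Greedily pick the highest-tension scenes that fit a trailer-length budget."""
    chosen, used = [], 0.0
    for s in sorted(scenes, key=tension, reverse=True):
        length = s.end - s.start
        if used + length <= max_seconds:
            chosen.append(s)
            used += length
    return sorted(chosen, key=lambda s: s.start)   # keep narrative order

scenes = [Scene(12.0, 18.5, 0.9, 0.7, 0.6), Scene(40.0, 52.0, 0.4, 0.3, 0.5)]
trailer = pick_trailer_scenes(scenes)
```

The hard part, of course, is producing those per-scene scores in the first place, which is where Watson's audiovisual analysis did the heavy lifting.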
An algorithm can detect rogue employees before they act
In the movie Minority Report, Tom Cruise played a precrime officer who arrests people before they commit a crime, based on a system that predicts who the criminals will be. The question is, how can you arrest a man who hasn't yet committed a crime, especially on nothing more than information provided by a system? The same concept was taken to the next level in the NBC series Person of Interest, in which Harold Finch, a mysterious billionaire, receives a message identifying a single person predicted to be either the victim or the perpetrator of a crime. He hires John Reese, a presumed-dead former CIA agent, to save the victim or stop the perpetrator as the crime unfolds. It was one of the better-written AI storylines, and it addresses many of the questions raised by Minority Report's premise.
JPMorgan Chase & Co. has paid more than $36 billion in legal bills since 2010. In 2013 alone, it spent over $12 billion, and now it is investing in an algorithm to reduce those legal bills. JPMorgan is spending $750 million on this surveillance program, which monitors dozens of inputs, including workers skipping compliance classes, violating personal trading rules, and breaching market-risk limits. It analyzes emails, IM conversations, and phone transcripts to determine whether employees are about to cause legal issues. According to Tim Estes, chief executive officer of Digital Reasoning Systems Inc., as quoted in Bloomberg, "We're taking technology that was built for counter-terrorism and using it against human language because that's where intentions are shown. If you want to be proactive, you have to get people before they act."
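At its core, the language-monitoring piece is a text-classification problem: score each message for compliance risk and surface the riskiest ones for human review. Below is a minimal scikit-learn sketch with entirely made-up training data and an arbitrary threshold; Digital Reasoning's actual system is far more sophisticated.

```python
# Toy compliance-risk classifier (illustrative; not Digital Reasoning's system).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Assumed: a small labeled set of past messages (1 = flagged by compliance, 0 = clean)
train_messages = [
    "let's keep this trade off the books until Friday",
    "reminder: quarterly compliance training is due next week",
    "delete that email chain before the audit",
    "lunch at noon?",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_messages, labels)

# Score new communications and escalate anything above a review threshold
new_messages = ["can we move this conversation to my personal phone?"]
risk = model.predict_proba(new_messages)[:, 1]
for msg, score in zip(new_messages, risk):
    if score > 0.5:                      # threshold is an arbitrary assumption
        print(f"ESCALATE ({score:.2f}): {msg}")
```

In practice, the interesting (and contentious) part is not the classifier but the decision of what to do with a high-risk score before anyone has actually done anything wrong.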
Teaching Assistant bot
Ashok Goel, a College of Computing professor at Georgia Tech, teaches the KBAI (Knowledge-Based Artificial Intelligence) class every semester; roughly 300 students take his course and end up posting more than 10,000 messages in the online forums. In spring 2016, he added a new TA (teaching assistant), Jill Watson, to answer questions in the forums. Jill Watson is a bot built with IBM Watson technologies and trained on every message posted in the KBAI forums since the class launched in fall 2014. With roughly 40,000 messages to learn from, and after some tinkering by the research team following the initial phase, the bot was soon answering questions with 97 percent certainty. Initially its responses were moderated by human TAs, but later it needed no assistance. Students, who didn't know Jill was a bot, were asked for their feedback on her, and it was uniformly positive.
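A crude way to approximate that behavior is retrieval: match a new question against previously answered ones and reply only when the match is confident enough, deferring to a human TA otherwise. Here is a sketch with invented forum data and an arbitrary similarity threshold; the real Jill Watson was built on IBM Watson's tooling, not this code.

```python
# Toy retrieval-based forum answerer (illustrative; not the actual Jill Watson).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Assumed: past forum questions paired with the answers human TAs gave
past_questions = [
    "When is assignment 2 due?",
    "Can we use Java instead of Python for the projects?",
    "Where do I submit the project proposal?",
]
past_answers = [
    "Assignment 2 is due Sunday at 11:59 pm ET.",
    "Yes, Java and Python are both accepted.",
    "Upload the proposal as a PDF to the class portal.",
]

vectorizer = TfidfVectorizer().fit(past_questions)
question_vectors = vectorizer.transform(past_questions)

def answer(new_question, threshold=0.97):
    """Answer only when the best match clears the confidence threshold."""
    sims = cosine_similarity(vectorizer.transform([new_question]), question_vectors)[0]
    best = sims.argmax()
    if sims[best] >= threshold:
        return past_answers[best]
    return None   # defer to a human TA

print(answer("when is assignment 2 due"))
```

The high confidence bar is the key design choice: a TA bot that stays silent when unsure is far less likely to be unmasked than one that guesses, which is part of why students never suspected Jill wasn't human.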
With extensive data collection and massively parallel processing, we can open the door to many machine learning use cases where systems learn by themselves without human intervention. Many more AI use cases are coming in this (Era of Tera) column - stay tuned!
Hari Gottipati is a tech evangelist based out of the Valley. Opinions expressed here are solely his own and do not reflect the views or opinions of his employer. His quotes can often be found in various technology publications, including GigaOM, CNN Money, WSJ, and Bloomberg Businessweek. Follow him on Twitter/Facebook/LinkedIn @harigottipati