It’s the end of the year, and as expected, the swirl of articles predicting what will happen in 2023 is filling up our social feeds and inboxes. After wading through clairvoyant claims about the next hottest thing in AI, we found our inner Grinch surfacing just in time for the holidays. So for this special end-of-year edition of our newsletter, we’re poopin’ on parades and kicking over crystal balls. Here’s what we’re betting won’t happen in the coming year:
ChatGPT isn’t going to replace Google.
Instead of typing in a search term and receiving a long list of results, why not just have a conversation with an AI chatbot that knows how to tailor results to your specific needs? Because that chatbot doesn’t actually understand your needs yet. ChatGPT is still a black-box tool that will need a whole lot more training and sophistication to match the transparency and accuracy of Google’s search today. While we agree that conversational AI has a bright future, addressing the issue of accuracy in a way that is both scalable and robust will be a major challenge for search innovators well beyond 2023. In the meantime, we’ll continue having fun debugging our code via ChatGPT and asking it what we should make for dinner.
Fully driverless cars won’t be taking over your morning commute.
You might have recently shared the road in San Francisco with a Waymo or GM’s Cruise vehicle, or even been added to Cruise’s beta pool of riders between 10 PM and 6 AM, but fully driverless cars aren’t ready for any kind of omnipresent scale. Although Cruise received approval from the California DMV to offer driverless rides free of time or location constraints, it will only be allowed to do so in areas with posted speed limits of up to 35 mph. While we await the final approval needed from the California Public Utilities Commission for Cruise’s robo-taxis to become mainstream, we can’t ignore the fact that the technology itself is still not up to par. As of today, these vehicles still can’t handle weather conditions worse than partly cloudy, and they have major difficulty navigating construction zones, animals, traffic cones, crossing guards, and what the industry calls “unprotected left turns.” The point is, autonomous vehicles have a long way to go to beat us at driving, and it seems highly unlikely that this will change at a pace that would make them ubiquitous in any reasonable time frame.
Stable Diffusion won’t go beyond artistic use cases.
Just as in the real world, psychedelic visuals are fantastic when you want them and absolutely horrific when you don’t. Using Stable Diffusion for a medical use case will take some time and likely won’t happen this year. Foundation models trained on natural images and language have notoriously underperformed when given domain-specific tasks. This is partly because professional fields such as medicine and finance have their own jargon, terminology, and rules, which are not accounted for in general training datasets. While medical researchers are hoping that models like Stable Diffusion can help alleviate the gap in training data, the difficulty of measuring clinical accuracy means meaningful adoption will likely take a while longer.
The fight between traditional artists and AI won’t be resolved this year.
The term “AI vegan” has officially entered the digital lexicon following mass protests against AI-generated artwork this year. It is now well known that LAION-5B, the dataset used to train Stable Diffusion, pulls images from all over the web without copyright permission or artist consent. With the general public now involved in the debate about data, intellectual property, and identity in AI, the pressure on the companies behind these models to listen will only continue to mount. For starters, Stability AI plans to let artists opt out of Stable Diffusion 3 image training, and it’s only a matter of time before other model providers follow suit.
The days of unregulated AI are coming to an end.
With the White House’s AI Bill of Rights and AI bills introduced in 17 states in 2022, it’s clear that the regulation wave is upon us. While prominent players like OpenAI have advocated for industry self-regulation in the absence of governmental rules, many experts believe it is not a long-term solution and that government regulation is essential. Whether or not you agree, widely applied AI regulations are just a matter of time, and they will soon reach beyond autonomous vehicles and defense-related use cases, where much of the legislative focus to date has been.
Want to learn more from Lightning AI? Subscribe to make sure you don’t miss the latest flashes of inspiration, news, tutorials, educational courses, and other AI-driven resources from around the industry. Thanks for reading!