New Post 9-27-2023

Top Story

Now ChatGPT can hear, see, and speak!

On Monday, OpenAI announced that ChatGPT has gone multimodal - able to respond to speech and images, as well as text. And it can talk back to you!

  • With the phone app, you can have back-and-forth conversations with ChatGPT, like Siri on steroids.

  • You can upload images to the app and have a conversation about them, like asking for recipes that use the contents of your pantry, or having it analyze the significance of a graph or chart.

The new multimodal features will be available to Plus and Enterprise subscribers in the next 2 weeks.

Clash of the Titans

Microsoft to use micro-nukes to power its data centers

Bill Gates has been a big fan of nuclear power since he was a teen. In 2008 he founded TerraPower, a self-described “nuclear innovation company” that designs and tests next-generation smaller, safer nuclear power plants. And now he wants to use small onsite nuclear reactors to supply the ravenous energy needs of cloud data centers.

Fissile materials next to a giant server farm running leading-edge AI systems - what could go wrong???

Amazon outbids Google for Claude - by $4 billion

Just a few short weeks ago, Google thought it had Anthropic’s Claude LLM all sewn up, with Anthropic using Google cloud services in exchange for a sweet $300 million investment. But in AI World, that’s chump change. Amazon, which makes far more profit from its AWS cloud service than from its entire online marketplace and upscale Whole Foods groceries, bought Anthropic’s affections with a $4 billion (with a B) investment. So, whoops, Google is out and Amazon is in as Claude’s primary cloud provider. And Amazon gets preferred access to Anthropic’s state-of-the-art LLM for its own AI projects, moving it from AI laggard to the new kid in AI town.

Fun News

AI steps into soccer cleats

International soccer governing organizations FIFA and IFAB have approved AI foot-tracking technology for use in official matches.

Playermaker, a self-styled maker of “smart shoe technology”, has developed strap-on sensors that can track a player’s moment-to-moment position, direction, and acceleration during game play. This can help players, along with their coaches and trainers, to analyze their performance for correctable flaws. And when all of the players on a team are wearing the tech, it gives the coach a “God’s eye” view of the mechanics of the game in real time.

With AI, will everyone be above average?

Wharton Business School Professor Ethan Mollick writes about the implications of a recent study showing that AI can improve the performance of knowledge workers by 40%. (We reported on this last week - the link to the article is here.) But AI helped poor performers more, “leveling up” their achievements to approach those of the higher-performing individuals. Mollick asks: if this holds true generally, will AI obliterate performance gaps?

Extreme parkour with 4-legged autonomous robots

Researchers at Carnegie Mellon University have developed cheap 4-legged autonomous robots that use AI to navigate real-world environments despite being built from low-precision off-the-shelf parts and a single shaky front-facing camera. Here they share a video of their robo-pets as they skitter, clamber, and jump on and over a variety of obstacles.

AI voice-cloned cover of Lana Del Rey’s “Young and Beautiful”

Voice cloning technology has gotten so good that the synthesized voices are difficult to tell from the real ones. Here is an AI voice-cloned cover of Lana Del Rey’s “Young and Beautiful” featuring Frank Sinatra, Rihanna, Freddie Mercury, Kurt Cobain, and more.

Paper Chase

Paper: Deception abilities emerge in LLMs

State-of-the-art LLMs apparently have developed an emergent ability to deceive that was not present in earlier models. Researchers tested several LLMs of varying complexity and found that the most recent and powerful models, such as GPT-4, show a surprising ability to practice deception when suitably prompted - an ability that simply does not exist in prior models, but “emerged” with the additional complexity of the recent SOTA models. The authors consider whether this ability can be further improved, as well as how to build safeguards against deception into future LLMs.

Pinocchio-bot

Paper: Chain of Density prompting

Researchers from Salesforce, MIT, and Columbia University have developed a new, powerful prompting method to generate concise summaries of long transcripts or other lengthy text works.

LLMs are generally good at summarizing, but simple prompts tend to produce “vanilla” summaries that are not optimally concise. The research team developed an iterative prompting method for GPT-4, in which the LLM is prompted to produce ever more concise and information-dense summaries of an original text by distilling each successive response yet again, up to a maximum of 5 iterations.

Human raters evaluated the various responses and tended to prefer the concise but complete summaries that emerged after 3 to 5 iterations of the prompting method. The authors believe that this “Chain of Density” iterative prompting method may be a powerful tool for generating optimally concise summaries.
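
For the curious, here is a rough sketch of what this kind of iterative “densifying” loop could look like in code. This is not the authors’ implementation - the prompt wording, the chain_of_density helper, and the use of the OpenAI Python client with GPT-4 are illustrative assumptions only.

  import openai  # assumes the openai package is installed and OPENAI_API_KEY is set

  def chain_of_density(text, iterations=5):
      """Repeatedly ask GPT-4 to rewrite its summary to be shorter and denser."""
      summary = None
      for _ in range(iterations):
          if summary is None:
              # First pass: a plain summary of the source text.
              prompt = f"Summarize the following text:\n\n{text}"
          else:
              # Later passes: distill the previous summary again, packing in
              # more specific details without losing anything important.
              prompt = (
                  "Rewrite the summary below so it is more concise and more "
                  "information-dense, keeping every important point.\n\n"
                  f"Source text:\n{text}\n\nCurrent summary:\n{summary}"
              )
          response = openai.ChatCompletion.create(
              model="gpt-4",
              messages=[{"role": "user", "content": prompt}],
          )
          summary = response.choices[0].message.content
      return summary

  # Example usage: print(chain_of_density(open("transcript.txt").read()))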

AI in Medicine

Autonomous robot needle navigates through living lung tissue

Scientists at the University of North Carolina have developed an autonomous steerable robot needle that can navigate living lung tissue to find and sample possible tumors that are too difficult to safely and accurately biopsy with conventional means.

UNC researchers used CT scans of the subject’s thoracic cavity to develop a 3-D model of the lung and all its important components, such as airways, blood vessels, and the chosen target. They then used AI to steer the needle to the target without damaging nearby vital structures.

This research is likely to have near-term utility in improving the ability of physicians to biopsy tissue in difficult-to-reach locations, and it also points toward a future when autonomous bots will be able to roam through the body to resect or repair a variety of internal organs.

Google and DoD build an AI microscope to detect cancer

Google is collaborating with the Department of Defense to bring AI to the pathology lab, helping pathologists improve their ability to diagnose cancer.

A conventional optical microscope is fitted with computer connections that allow AI to overlay the pathologist’s view with pointers and flags that highlight suspicious areas of the tissue being studied. The current goal of the project is to assist pathologists working in remote locations, like many US military installations around the world, to quickly get assistance with difficult diagnoses.

That's a wrap! More news next week.