
Stephanie Song, formerly on the corporate development and ventures team at Coinbase, was often frustrated by the volume of due diligence tasks she and her team had to work through each day.
That frustration led to Dili, a new Y Combinator-backed startup she co-founded, which aims to automate different aspects of the due diligence process using AI.
Although I like to write about future predictions for the world of technology and business, I’m usually focused on what’s coming up in the next five years.
But it’s also worth taking a longer view. I believe that in 10 years’ time, the AI that’s a part of everyday life will be as far evolved from today’s AI as today’s internet is from the internet of the early days.
In his excellent book The Coming Wave, Mustafa Suleyman notes that every wave of technology-driven change – from the combustion engine to the internet – has revolutionized society in a shorter time span than the previous wave. So, I don’t think we’ll have to wait 30 or even 20 years until AI is utterly engrained in all aspects of life.
Instead, let’s look ahead just 10 years to 2034. Halfway through the next decade, a lot will have changed, but what will AI look like? Here’s what I think!
EpiSci has won a $1.6 million contract with the US Space Development Agency (SDA) to further develop its AI-powered hypersonic missile tracking system.
Under the partnership, the company will build software that detects and tracks hypersonic missiles by combining satellite sensor data with AI-driven analytics.
Researchers at Amazon AGI have made waves with what is being called the largest text-to-speech model ever developed: Big Adaptive Streamable TTS with Emergent abilities, or BASE TTS.
Text-to-speech (TTS) models convert written text into spoken words and underpin the voice assistants built into smart devices, allowing them to communicate with users in a natural, human-like way.
The best of these models produce output that closely resembles natural speech, complete with intonation, emphasis, and inflection.
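BASE TTS itself has not been released publicly, so as a rough illustration of what the text-to-speech step looks like in practice, here is a minimal sketch using the generic open-source pyttsx3 library. This is an assumption chosen purely for illustration, not Amazon's model or its API.

```python
# Minimal text-to-speech sketch using the offline pyttsx3 library
# (an illustrative stand-in; BASE TTS is not publicly available).
import pyttsx3

engine = pyttsx3.init()           # use the platform's default speech engine
engine.setProperty("rate", 160)   # speaking speed in words per minute
engine.say("Hello! Text-to-speech turns written text into spoken words.")
engine.runAndWait()               # block until the utterance has been spoken
```

Large models like BASE TTS go well beyond this kind of engine by learning intonation and emphasis from data rather than relying on hand-tuned voices.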
In an age of increasingly advanced robotics, one team has well and truly bucked the trend, finding inspiration in the pinhead-sized brain of a tiny flying insect to build a robot that deftly avoids collisions with very little effort and energy.
An insect’s tiny brain is an unlikely source of biomimicry, but researchers from the University of Groningen in the Netherlands and Bielefeld University in Germany saw it as an ideal model for how robots should move. Fruit flies (Drosophila melanogaster) possess remarkably simple but effective navigational skills, using very little brainpower to travel swiftly along invisible straight lines, then adjusting – flying along a line angled to the left or the right – to avoid obstacles.
With such a tiny brain, the fruit fly has limited computational resources available to it in flight – a biological model, the scientists believed, that could be adapted for use in the ‘brain’ of a robot for efficient, low-energy, obstacle-avoiding locomotion.
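The researchers' actual controller isn't reproduced here, but the behaviour described above can be caricatured in a few lines: keep flying straight, and when something gets too close, make a discrete turn away from it. The function below is a hypothetical sketch; the names, thresholds and fixed turn angle are illustrative assumptions, not the team's algorithm.

```python
# Hypothetical fly-style steering rule: hold a straight heading, and when an
# obstacle comes within range, make a fixed-size turn away from it.
import math
import random

def steer(heading, obstacle_bearing, obstacle_distance,
          avoid_distance=1.0, turn_angle=math.radians(30)):
    """Return an updated heading in radians.

    heading           -- current travel direction (counter-clockwise convention)
    obstacle_bearing  -- angle from the heading to the nearest obstacle,
                         positive = to the left, negative = to the right,
                         or None if nothing is detected
    obstacle_distance -- distance to that obstacle, or None
    """
    if obstacle_bearing is None or obstacle_distance > avoid_distance:
        return heading                                 # nothing nearby: fly straight
    if abs(obstacle_bearing) < 1e-6:
        obstacle_bearing = random.choice((-1.0, 1.0))  # dead ahead: pick a side
    # Turn a fixed amount toward the side away from the obstacle,
    # mimicking the fly's discrete left/right course corrections.
    return heading - math.copysign(turn_angle, obstacle_bearing)

# Example: an obstacle 0.4 units away, slightly to the left of the heading.
new_heading = steer(heading=0.0,
                    obstacle_bearing=math.radians(10),
                    obstacle_distance=0.4)
```

The appeal of a rule this simple is that it needs almost no computation or sensing, which is exactly the property that makes the fly's navigation attractive for low-power robots.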