
Tesla is preparing to launch an affordable vehicle and a robo-taxi service, highlighted by the upcoming Project Alicorn software update and the new long-range Model Y, aimed at enhancing the user experience and meeting market demand.

## Questions to inspire discussion

## Tesla’s New Affordable Vehicle

🚗 Q: What are the key features of Tesla’s upcoming affordable vehicle? A: Expected to launch in the first half of 2024, it will be a lower, more compact version of the Model Y, possibly a hatchback, with a starting price of $44,990 in the US.

🏎️ Q: How does the new rear-wheel-drive Model Y compare to previous models? A: It offers 20 miles more range, a faster 0–60 time, and new features like an improved speaker and sound system, making it a bargain at $44,990.

## Robotaxi Functionality

🤖 Q: What is Tesla’s robotaxi project called and what features will it have? A: Called Project Alicorn, it will let users confirm a pickup, enter a destination, fasten their seatbelt, pull over, cancel a pickup, and access emergency help.

📱 Q: What additional features are coming to the robotaxi app? A: Upcoming features include Smart Summon without a continuous press, live activities, a trip summary screen, the ability to close the trunk, ride rating, and help outside the service area.

🚕 Q: How might Tesla expand its robotaxi service to non-driverless markets? A: The app includes a “call driver” button, potentially allowing non-driverless markets to join the ride-share network, though this strategy is unclear.

## CyberCab Production

Such questions quickly run into the limits of knowledge for both biology and computer science. To answer them, we need to figure out what exactly we mean by “information” and how that’s related to what’s happening inside cells. In attempting that, I will lead you through a frantic tour of information theory and molecular biology. We’ll meet some strange characters, including genomic compression algorithms based on deep learning, retrotransposons, and Kolmogorov complexity.

Ultimately, I’ll argue that the intuitive idea of information in a genome is best captured by a new definition of a “bit” — one that’s unknowable with our current level of scientific knowledge.
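One way to make this notion of genome information concrete is Kolmogorov complexity: the length of the shortest program that outputs a given sequence. It is uncomputable, but any lossless compressor yields an upper bound on it. A minimal sketch, using zlib as a stand-in for the deep-learning genomic compressors mentioned above:

```python
import zlib

def compressed_bits(s: bytes) -> int:
    """Upper-bound the information content of s, in bits, by compressing it."""
    return len(zlib.compress(s, 9)) * 8

# A highly repetitive "genome" compresses far below its raw size, so its
# information content is much smaller than its raw length suggests.
repetitive = b"ACGT" * 2560          # 10,240 bases = 81,920 raw bits
print(compressed_bits(repetitive), len(repetitive) * 8)
```

A better compressor gives a tighter bound, but no compressor can tell us the true Kolmogorov complexity, which is one reason the "real" information content of a genome remains out of reach.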

The operation of quantum technologies relies on the reliable realization and control of quantum states, particularly entanglement. In the context of quantum physics, entanglement entails a connection between particles, whereby measuring one determines the result of measuring the other even when they are distant from each other, and in a way that defies any intuitive explanation.

A key challenge in the development of reliable quantum technologies is that entanglement is highly susceptible to noise (i.e., random interactions with the environment). Such interactions can degrade the desired entangled state and, in turn, reduce the performance of quantum technologies.
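The correlation just described can be illustrated with a minimal classical sketch of the measurement statistics of the Bell state |Φ+⟩ = (|00⟩ + |11⟩)/√2, measured in the same basis on both sides. Note this reproduces only the same-basis statistics, which alone admit a local explanation; the genuinely quantum behavior appears when comparing outcomes across multiple measurement bases.

```python
import random

def measure_bell_pair() -> tuple[int, int]:
    """Sample a joint same-basis measurement of |Phi+> = (|00> + |11>)/sqrt(2)."""
    outcome = random.choice([0, 1])   # 50/50 between the |00> and |11> branches
    return outcome, outcome          # the two distant results always agree

samples = [measure_bell_pair() for _ in range(1000)]
print(all(a == b for a, b in samples))  # True: outcomes are perfectly correlated
```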

Researchers at Shandong University in China and National Cheng Kung University in Taiwan recently implemented a key step to experimentally recover hidden quantum correlations from higher-dimensional entangled states.

Buried deep in the ice in the Antarctic are “eyes” that can see elementary particles called neutrinos, and what they’ve observed is puzzling scientists: a remarkably strong neutrino signal accompanied by a surprisingly weak gamma-ray emission in the galaxy NGC 1068, also known as the Squid galaxy.

A human clearing junk out of an attic can often guess the contents of a box simply by picking it up and giving it a shake, without the need to see what’s inside. Researchers from MIT, Amazon Robotics, and the University of British Columbia have taught robots to do something similar.

They developed a technique that enables robots to learn about an object’s weight, softness, or contents using only internal sensors, by picking it up and gently shaking it. With this method, which requires no external measurement tools or cameras, they can accurately estimate parameters like an object’s mass in a matter of seconds.

This low-cost technique could be especially useful in applications where cameras might be less effective, such as sorting objects in a dark basement or clearing rubble inside a building that partially collapsed after an earthquake.
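The core idea can be sketched with Newton’s second law: if a robot logs the force its gripper applies and the resulting acceleration while shaking, the object’s mass falls out of a least-squares fit of F = m·a. This is a hypothetical illustration of the principle, not the researchers’ actual method; all names and numbers below are invented.

```python
import random

def estimate_mass(forces, accels):
    """Least-squares estimate of m minimizing the sum of (F - m*a)**2."""
    return sum(f * a for f, a in zip(forces, accels)) / sum(a * a for a in accels)

random.seed(0)
true_mass = 2.5                                            # kg, hidden from the estimator
accels = [random.uniform(-5.0, 5.0) for _ in range(200)]   # accelerations during a shake
forces = [true_mass * a + random.gauss(0, 0.1) for a in accels]  # noisy force readings

print(round(estimate_mass(forces, accels), 2))  # close to the true 2.5 kg
```

With a couple hundred noisy samples, the fit recovers the hidden mass to within a few grams, which is why a brief shake is enough.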

Flexcompute’s Flow360, the most trusted GPU-native CFD solution for advanced aviation, accelerates the aerospace design process by optimizing evaluations, enhancing aerodatabase development, and reducing time-to-market while ensuring compliance with regulatory requirements. In collaboration with OEMs, companies like JetZero are using Flow360 to push the boundaries of efficiency and sustainability, advancing revolutionary designs such as blended-wing-body (BWB) aircraft, hydrogen-powered models, and advanced propulsion systems. This strategic partnership is crucial to transforming air travel and achieving global sustainability goals, accelerating the next era of aviation innovation.

In this webinar, hear from Qiqi Wang on the latest advancements in high-fidelity CFD, joined by John Vassberg, Chief Design Officer at JetZero, as they explore the cutting-edge technologies driving the future of aviation. They will discuss how GPU-powered CFD is enabling faster, more sustainable aircraft design and how strategic collaboration is key to realizing the industry’s ambitious goals.

A team of researchers at Nagoya University has discovered something surprising. If you have two tiny vibrating elements, each one barely moving on its own, and you combine them in the right way, their combined vibration can be amplified dramatically—up to 100 million times.

The paper is published in the Chaos: An Interdisciplinary Journal of Nonlinear Science.

Their findings suggest that by relying on structural amplification rather than raw power, even small, simple devices can transmit clear signals over long distances, with potential applications in long-distance communications and remote medical devices.