Like many things about Elon Musk, Tesla’s approach to achieving autonomous driving is polarizing. Bucking the map-based trend set by industry veterans such as Waymo, Tesla opted to dedicate its resources to a vision-based approach to full self-driving instead. This involves a lot of hard, tedious work on Tesla’s part, but today there are indications that the company’s controversial strategy is finally paying off.

In a recent talk, Tesla AI Director Andrej Karpathy discussed the key differences between the map-based approach of Waymo and Tesla’s camera-based strategy. According to Karpathy, Waymo’s use of pre-mapped data and LiDAR makes scaling difficult, since vehicles’ autonomous capabilities are practically tied to a geofenced area. Tesla’s vision-based approach, which relies on cameras and artificial intelligence, is not tied to any such area. This means that Autopilot and FSD improvements can be rolled out to the entire fleet and will function anywhere.

This rather ambitious plan for Tesla’s full self-driving system has drawn a lot of skepticism in the past, with critics arguing that a map-based approach is the way to go. Tesla, in response, dug in its heels and doubled down on its vision-based initiative. This, in a way, meant that Autopilot improvements and the rollout of FSD features took a lot of time, particularly since training the neural networks that recognize objects and driving behavior on the road requires massive amounts of real-world data.

When the next Ford F-150 arrives on American roads, you’ll recognize it immediately even if you can’t see the emblem on its grille. The company published a preview image that reveals the truck’s LED lighting signature.

Posted on Twitter, the blacked-out photo is our first official look at the next-generation F-150 due out for the 2021 model year. It confirms the front end receives two pairs of LEDs that create the outline of a rectangle when lit. The top bars frame the headlights and stretch into the grille, while the lower bars underline the fog lights.

Our spies have regularly sent us images of camouflaged F-150 test mules taken all over the United States, so we have a decent idea of what to expect from the truck, and the preview image reveals nothing that we don’t already know. It wears a tall hood with sculpted sides, vertical headlights, and rectangular mirrors. Its design is more of an evolution than a revolution, but Ford hinted it’s making significant changes under the body panels.

When opportunity knocks, open the door: No one has taken heed of that adage like Nvidia, which has transformed itself from a company focused on catering to the needs of video gamers to one at the heart of the artificial-intelligence revolution. In 2001, no one predicted that the same processor architecture developed to draw realistic explosions in 3D would be just the thing to power a renaissance in deep learning. But when Nvidia realized that academics were gobbling up its graphics cards, it responded, supporting researchers with the launch of the CUDA parallel computing software framework in 2006.

Since then, Nvidia has been a big player in the world of high-end embedded AI applications, where teams of highly trained (and paid) engineers have used its hardware for things like autonomous vehicles. Now the company claims to be making it easy for even hobbyists to use embedded machine learning, with its US $100 Jetson Nano dev kit, which was originally launched in early 2019 and rereleased this March with several upgrades. So, I set out to see just how easy it was: Could I, for example, quickly and cheaply make a camera that could recognize and track chosen objects?
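For a sense of how little code that question takes, here is a minimal sketch of such a camera app using NVIDIA’s jetson-inference Python bindings, the library the Jetson Nano’s official tutorials are built around. The model name, camera URI, and detection threshold below are common defaults rather than settings from this article, and this covers recognition only; tracking a chosen object across frames would need extra logic on top.

```python
import jetson.inference
import jetson.utils

# Load a pretrained SSD-MobileNet-v2 detector (weights download on first run)
net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)

camera = jetson.utils.videoSource("csi://0")       # Pi-style CSI camera module
display = jetson.utils.videoOutput("display://0")  # attached monitor

while display.IsStreaming():
    img = camera.Capture()
    detections = net.Detect(img)  # draws boxes and labels onto img in place
    display.Render(img)
    display.SetStatus("detectNet | {:.0f} FPS".format(net.GetNetworkFPS()))
```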

Embedded machine learning is evolving rapidly. In April 2019, Hands On looked at Google’s Coral Dev Board, which incorporates the company’s Edge tensor processing unit (TPU), and in July 2019, IEEE Spectrum featured Adafruit’s software library, which lets even a handheld game device do simple speech recognition. The Jetson Nano is closer to the Coral Dev Board: with its 128 parallel processing cores it, like the Coral, is powerful enough to handle a real-time video feed, and both have Raspberry Pi–style 40-pin GPIO connectors for driving external hardware.

How can we train self-driving vehicles to have a deeper awareness of the world around them? Can computers learn from past experiences to recognize future patterns that can help them safely navigate new and unpredictable situations?

These are some of the questions researchers from the AgeLab at the MIT Center for Transportation and Logistics and the Toyota Collaborative Safety Research Center (CSRC) are trying to answer by sharing an innovative new open dataset called DriveSeg.

Through the release of DriveSeg, MIT and Toyota are working to advance research in autonomous driving systems that, much like human perception, perceive the driving environment as a continuous flow of visual information.
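To make “continuous flow” concrete, here is a minimal sketch of frame-by-frame, pixel-level segmentation over a driving clip. It uses a generic pretrained DeepLabV3 model from torchvision and OpenCV purely as stand-ins; DriveSeg ships its own video and pixel-level annotations with its own label set, and the file name below is a placeholder, not a DriveSeg file.

```python
import torch
import torchvision
from torchvision import transforms
import cv2

# Generic pretrained segmentation model as a stand-in for DriveSeg's labels
model = torchvision.models.segmentation.deeplabv3_resnet50(pretrained=True).eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

cap = cv2.VideoCapture("drive.mp4")  # placeholder clip name
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    batch = preprocess(rgb).unsqueeze(0)
    with torch.no_grad():
        out = model(batch)["out"]      # per-pixel class scores
    labels = out.argmax(1).squeeze(0)  # HxW class map for this frame
cap.release()
```

Treating the clip as a stream of per-pixel class maps, rather than isolated still images, is the kind of representation the dataset is meant to support.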

“Beam me up” is one of the most famous catchphrases from the Star Trek series. It is the command issued when a character wishes to teleport from a remote location back to the Starship Enterprise.

While human teleportation exists only in science fiction, teleportation is possible in the subatomic world of quantum mechanics—albeit not in the way typically depicted on TV. In the quantum world, teleportation involves the transportation of information, rather than the transportation of matter.

Last year scientists confirmed that information could be passed between photons even when the photons were not physically linked.
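For readers curious what “teleporting information” means in circuit terms, below is a minimal sketch of the textbook single-qubit teleportation protocol in Qiskit. It uses the deferred-measurement form, so the sender’s classical corrections appear as controlled gates; this is the standard protocol, not the specific experiment mentioned above.

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(3)

# Qubit 0 holds the state to teleport; prepare an arbitrary example state
qc.ry(0.8, 0)

# Share a Bell pair between sender (qubit 1) and receiver (qubit 2)
qc.h(1)
qc.cx(1, 2)

# Sender interacts the payload with her half of the pair (Bell basis)
qc.cx(0, 1)
qc.h(0)

# Deferred-measurement corrections: after these, qubit 2 holds the
# original state -- no matter was moved, only quantum information
qc.cx(1, 2)
qc.cz(0, 2)
```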


As e-bikes continue to explode in popularity and cause a nationwide shortage of the two-wheeled transports, waves of new riders are discovering the benefits of pedaling their way to work without breaking a sweat. And the ability to distance oneself from other commuters in crowded public transportation certainly sweetens the deal.

But many more are discovering a significant downside to bikes: riders aren’t protected from the weather, be it heavy rain or glaring sun. That’s a problem CityQ is hoping to solve with its enclosed electric vehicle, the CityQ Car-eBike.

A Tesla Model 3 has been modified with a solar roof as part of Lightyear’s solar car development program.

We have been reporting on Lightyear for a few years now.

The startup first caught our attention because it spun out of Solar Team Eindhoven, a group of engineering students from the Eindhoven University of Technology (Netherlands) who have been competing in the World Solar Challenge with their Stella and Stella Lux, energy-positive solar cars — meaning that they can produce more energy than they consume.

Daimler has announced an upcoming truck called the eEconic, a garbage truck based on the all-electric Mercedes-Benz eActros.

The German automotive company announced the vehicle today:

“The eEconic will at first be offered in the configuration 6×2/N NLA and is mainly in demand as a waste-collection vehicle. Battery-electric trucks are very well suited for urban use in waste management due to the comparatively short and plannable daily routes of up to 100 kilometers with a high proportion of stop-and-go in inner-city traffic. With an anticipatory driving style, electrical energy can be recovered during braking to charge the battery, which further improves range and efficiency.”
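As a rough back-of-the-envelope illustration of why stop-and-go routes favor recuperation, the sketch below estimates the energy recoverable from one full stop. All numbers are assumptions for illustration, not Daimler figures.

```python
# Illustrative, assumed values -- not from Daimler:
mass_kg = 27_000   # assumed loaded refuse-truck mass
v = 50 / 3.6       # 50 km/h converted to m/s
eta = 0.6          # assumed round-trip recuperation efficiency

kinetic_j = 0.5 * mass_kg * v ** 2          # kinetic energy at speed
recovered_kwh = eta * kinetic_j / 3.6e6     # joules -> kWh
print(f"~{recovered_kwh:.2f} kWh recovered per full stop")  # ~0.43 kWh
```

Over dozens of stops per collection route, fractions of a kilowatt-hour per stop add up, which is why the press release singles out inner-city stop-and-go traffic.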

“We have also partnered with Texas EquuSearch and the National Center for Missing and Exploited Children to tap into their resources as well. We have participated in ground and air searches on Fort Hood and throughout the central Texas region,” Grey said.

The soldier was last seen between 11:30 a.m. and 12:30 p.m. April 22 in the parking lot of 3rd Cavalry Regiment’s engineer squadron headquarters, where she worked in the armory room. Her car keys, barracks room key, identification card and wallet were later found there.