Tesla CEO Elon Musk is planning to open his own STEM-focused primary and secondary school, and eventually a university, in Austin, Texas, according to tax filings.
Future quantum computers are expected to revolutionize problem-solving in various fields, such as creating sustainable materials, developing new medications, and unraveling complex issues in fundamental physics. However, these pioneering quantum systems are currently more error-prone than the classical computers we use today. Wouldn’t it be nice if researchers could just take out a special quantum eraser and get rid of the mistakes?
Reporting in the journal Nature, a group of researchers led by Caltech is among the first to demonstrate a type of quantum eraser. The physicists show that they can pinpoint and correct for mistakes in quantum computing systems known as “erasure” errors.
“It’s normally very hard to detect errors in quantum computers, because just the act of looking for errors causes more to occur,” says Adam Shaw, co-lead author of the new study and a graduate student in the laboratory of Manuel Endres, a professor of physics at Caltech. “But we show that with some careful control, we can precisely locate and erase certain errors without consequence, which is where the name erasure comes from.”
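As a loose classical analogy (not the Caltech team's quantum protocol), the advantage of erasure errors is that their location is known, so less redundancy is needed to fix them. A minimal sketch in Python, using a single-parity-check code, where the function name and layout are illustrative assumptions:

```python
# Toy illustration: an error at a KNOWN position (an "erasure") can be
# corrected with just one parity bit, because the XOR of all bits in a
# valid codeword is constrained to 0. An error at an unknown position
# would require more redundancy to even locate.

def correct_erasure(codeword, erased_index):
    """Restore a single erased bit from the remaining bits' parity."""
    known = [b for i, b in enumerate(codeword) if i != erased_index]
    restored = codeword[:]
    restored[erased_index] = sum(known) % 2  # parity fills the gap
    return restored

# Encode 3 data bits with one parity bit (XOR of the data).
data = [1, 0, 1]
codeword = data + [sum(data) % 2]  # -> [1, 0, 1, 0]

damaged = codeword[:]
damaged[1] = None  # bit 1 is erased, but we KNOW where
print(correct_erasure(damaged, 1))  # recovers the original codeword
```

The same principle underlies the quantum result: pinpointing where an error occurred makes correcting it far cheaper than searching for it.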
Artificial Intelligence (AI) is changing, and it will continue to do so. This year of generative AI stands out as a seminal moment in time, and now we need the right developer tools.
Bad things can happen when you hallucinate. If you are human, you can end up doing things like putting your underwear in the oven. If you happen to be a chatbot or some other type of artificial intelligence (AI) tool, you can spew out false and misleading information, which—depending on the info—could affect many, many people in a bad-for-your-health-and-well-being type of way. And this latter type of hallucinating has become increasingly common in 2023 with the continuing proliferation of AI. That’s why Dictionary.com has an AI-specific definition of “hallucinate” and has named the word as its 2023 Word of the Year.
Dictionary.com noticed a 46% jump in dictionary lookups for the word “hallucinate” from 2022 to 2023, with a comparable increase in searches for “hallucination” as well. Meanwhile, there was a 62% jump in searches for AI-related words like “chatbot,” “GPT,” “generative AI,” and “LLM.” So the increase in searches for “hallucinate” is likely driven more by the following AI-specific definition of the word from Dictionary.com than by the traditional human definition:
hallucinate [ huh-loo-suh-neyt ] verb. (of artificial intelligence) to produce false information contrary to the intent of the user and present it as if true and factual. Example: When chatbots hallucinate, the result is often not just inaccurate but completely fabricated.
Here’s a non-AI-generated news flash: AI can lie, just like humans. Not all AI, of course. But AI tools can be programmed to serve like little political animals or snake oil salespeople, generating false information while making it seem like it’s all about facts. The difference from humans is that AI can churn out this misinformation and disinformation at even greater speeds. For example, a study published in JAMA Internal Medicine last month showed how OpenAI’s GPT Playground could generate 102 different blog articles “that contained more than 17,000 words of disinformation related to vaccines and vaping” within just 65 minutes. Yes, just 65 minutes. That’s about how long it takes to watch the TV show 60 Minutes and then make a quick uncomplicated bathroom trip that doesn’t involve texting on the toilet. Moreover, the study demonstrated how “additional generative AI tools created an accompanying 20 realistic images in less than 2 minutes.”
It features a sleek, aerodynamic design with no tilt mechanisms or extra parts, eliminating points of failure and unnecessary weight for efficient vertical flight.
The innovative aerial solution is ideal for applications ranging from high-precision mapping to agriculture, ensuring efficient coverage in every flight.
NASA has awarded GE Aerospace a contract to continue working on the HyTEC program, which aims to create more fuel-efficient and eco-friendly engines for commercial aircraft.
NASA has renewed a contract with GE Aerospace to continue the HyTEC program, which aims to create more fuel-efficient engines for commercial aircraft.
According to a new study in Nature, China could be facing an impending wind turbine waste problem if it does not find ways to manage the waste from its aging installations.
While China has forged ahead with its renewable energy infrastructure, it must find ways to deal with the waste from older installations, study finds.
Discover the financial and environmental benefits of pairing Tesla’s Powerwall with solar, a combination claimed to be 20% cheaper over a decade than traditional backup generators.
Unleash savings with Tesla’s Powerwall and solar combo, a green powerhouse that beats backup generators by 20% over 10 years.
Researchers call it the ‘Holy Grail’ for physicists and engineers.
A group of researchers, led by Professor Chan Chi-hou from the City University of Hong Kong, created a special antenna that can control all five important aspects of electromagnetic waves using computer software.
The antenna, which they have named ‘microwave universal metasurface antenna,’ is capable of dynamically, simultaneously, independently, and precisely manipulating all the essential properties of electromagnetic waves through software control.
“A universal component capable of manipulating all the fundamental wave properties is the Holy Grail for physicists and engineers,” said Professor Chan.
Researchers from the RAND Corporation — which took more than $15 million this year from a group financed by a Facebook co-founder — were a driving force behind the White House’s sweeping new AI reporting requirements.