
In June 2022, Amazon re:MARS, the company’s in-person event that explores advancements and practical applications within machine learning, automation, robotics, and space (MARS), took place in Las Vegas. The event brought together thought leaders and technical experts building the future of artificial intelligence and machine learning, and included keynote talks, innovation spotlights, and a series of breakout-session talks.

Now, in our re:MARS revisited series, Amazon Science is taking a look back at some of the keynotes and breakout-session talks from the conference. We’ve asked presenters three questions about their talks, and we provide the full video of each presentation.

On June 24, Alexa AI-Natural Understanding employees Craig Saunders, director of machine learning, and Devesh Pandey, principal product manager, presented their talk, “Human-like reasoning for an AI”. Their presentation focused on how Amazon is developing human-like reasoning for Alexa, including how Alexa can automatically recover from errors such as recognizing “turn on lights” in a noisy environment (instead of “turn off lights”) when the lights are already on.
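The kind of context-aware recovery described above can be sketched as a simple post-recognition check: if the literal transcription would be a no-op given the current device state, prefer an acoustically similar alternative that does make sense. The confusion pairs and function names below are illustrative assumptions, not Amazon’s implementation.

```python
# Illustrative sketch of context-aware speech-recognition error recovery.
# This is NOT Alexa's actual logic; the confusable-command table is a
# made-up assumption for illustration.

# Hypothetical pairs of commands that are easily confused in noisy audio.
CONFUSABLE = {
    "turn on lights": "turn off lights",
    "turn off lights": "turn on lights",
}

def is_noop(command: str, lights_on: bool) -> bool:
    """True if the command would not change the current light state."""
    if command == "turn on lights":
        return lights_on
    if command == "turn off lights":
        return not lights_on
    return False

def recover(command: str, lights_on: bool) -> str:
    """Swap in the confusable alternative when the literal command is a no-op."""
    if is_noop(command, lights_on) and command in CONFUSABLE:
        return CONFUSABLE[command]
    return command
```

With the lights already on, a misheard “turn on lights” is reinterpreted as “turn off lights”; when the literal command is actionable, it passes through unchanged.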

ChatGPT is a large language model trained by OpenAI. Its function is to assist users in generating human-like text based on the input provided to it. It can assist with a wide range of tasks, such as answering questions, providing explanations, and generating original text. It’s designed to generate natural-sounding text, and it’s constantly learning and improving. It’s able to process and generate text at scale, making it a powerful tool for natural language processing and generation. Its ultimate goal is to make it easier for people to interact with computers and access information using natural language.

Give it a try: https://openai.com/blog/chatgpt/

I had ChatGPT rewrite an article…


We’ve trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer followup questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. ChatGPT is a sibling model to InstructGPT, which is trained to follow an instruction in a prompt and provide a detailed response.

Standing among solar arrays and power grid equipment at the National Renewable Energy Laboratory (NREL), you might hear a faint, distorted melody buzzing from somewhere. You are not hallucinating—that gray box really is singing the Star Wars Theme, or the ice cream truck song, or Chopin’s Waltz in A minor. Power system engineers are just having some fun with an NREL capability that prevents stability problems on the electrical grid.

Usually, the engineers send another kind of waveform through the inverters and load banks: megawatts of power and voltage vibrations at many frequencies. The purpose of their research is to see how inverters and the grid interact—to get them “in tune” and prevent dangerous electrical oscillations that show up like screechy feedback or a booming sub-bass.

The engineers can do this analysis at full power with NREL hardware using the lab’s advanced impedance measurement system. They have also produced commercially available software, the Grid Impedance Scan Tool (GIST), that can do the same with simulated power on device models, allowing any manufacturer or grid operator to certify grid compatibility with renewable energy resources.
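The idea behind impedance-based stability screening can be illustrated with a toy check: at each scanned frequency, form the ratio of grid impedance to device output impedance, and flag frequencies where that loop gain approaches the Nyquist criterion’s −1 point, which signals a risk of sustained oscillation. The numbers and threshold below are invented for illustration; this is not GIST’s algorithm.

```python
# Toy impedance-based stability screen (illustrative only; not the GIST tool).
# At each frequency, form the minor-loop gain L = Z_grid / Z_device.
# By the Nyquist criterion, L near the -1 point indicates a risk of
# sustained oscillation at that frequency.

def oscillation_risk(z_grid: complex, z_device: complex, margin: float = 0.5) -> bool:
    """Flag risk when the loop gain is within `margin` of the -1 point."""
    loop_gain = z_grid / z_device
    return abs(loop_gain + 1) < margin

# Hypothetical impedance scans (ohms), keyed by frequency in Hz:
# frequency -> (grid impedance, device output impedance).
scan = {
    10:  (complex(0.5, 2.0), complex(1.0, -0.5)),
    60:  (complex(0.2, 0.8), complex(5.0, 1.0)),
    300: (complex(1.0, 4.0), complex(-1.1, -4.2)),
}

risky_frequencies = [f for f, (zg, zd) in scan.items() if oscillation_risk(zg, zd)]
# With these made-up values, only the 300 Hz point is flagged.
```

In this sketch, the 300 Hz scan point has a loop gain of roughly −0.95, close enough to −1 to be flagged, while the other frequencies clear the margin comfortably.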