
Really bad right now, but possibly the future of the entertainment industry:


Transphobic comments led to Larry Feinberg’s downfall on Twitch. According to the host, the cause of Feinberg’s bias was an outdated OpenAI language model running without a functioning moderation system.

Since mid-December 2022, the small media group Mismatch Media has been running one of the most unusual shows on Twitch (and that’s saying something): Using AI tools like DALL-E, GPT-3, Stable Diffusion, and more, Mismatch Media broadcasts an AI-generated show inspired by the popular U.S. sitcom “Seinfeld” every day, around the clock. “Nothing, Forever” is the name the team has given to their art project.

The AI-generated content is stitched together in the Unity engine to create an audiovisual pixel show reminiscent of early ’90s video games. The jokes rarely have punchlines, the conversations are empty and incoherent, and the audience’s fake applause starts in the wrong places.

https://youtu.be/tiAw2gSesoM


Neuralink is a neurotechnology company founded by Elon Musk in 2016 with the goal of merging the human brain with artificial intelligence. The company aims to develop a brain-machine interface that will enable humans to communicate with computers and other devices directly through their thoughts. Neuralink’s ultimate vision is to create a symbiotic relationship between humans and AI, where the brain and the computer work together to enhance human capabilities. While there is huge potential in this field, it could also turn out to be extremely dangerous. Here’s why.

Jim Cantrell is an entrepreneur, strategist, and subject matter expert in satellite systems and space system markets, as well as a road racer. He is the founder of StratSpace, founder of Vintage Exotics Competition Engineering, and was an early partner and VP at SpaceX.

Links
https://en.wikipedia.org/wiki/Jim_Cantrell
https://twitter.com/jamesncantrell?lang=en
https://www.goodreads.com/author/show/18894007.Jim_Cantrell
http://www.jimcantrell.com/book (new book)

PODCAST INFO:
The Learning With Lowell show is a series for the everyday mammal. In this show we learn about leadership, science, and the people building their change into the world. The goal is to dig deeply into people most of us wouldn’t normally get to hear from. The host of the show, Lowell Thompson, is a lifelong autodidact, serial problem solver, and founder of startups.

LINKS
Youtube: https://www.youtube.com/channel/UCzri06unR-lMXbl6sqWP_-Q
Youtube clips: https://www.youtube.com/channel/UC-B5x371AzTGgK-_q3U_KfA
Linkedin: https://www.linkedin.com/in/lowell-thompson-2227b074
Twitter: https://twitter.com/LWThompson5
Website: https://www.learningwithlowell.com/
Shownotes / Timestamps
00:00 Intro
00:30 Book launch
01:30 Miniseries
02:15 Audiobook
04:15 Self publishing
05:26 Andy Weir / Conferences
06:25 Carl Sagan simile
15:02 SpaceX and Blue Origin, Jeff Bezos and Elon Musk
21:17 Jeff Bezos Blue Origin true origin story
26:08 Jeff Bezos accountability
31:11 SpaceX Raptor 2 engine design compared to Phantom Space engines
34:45 Engines and KSP 2
37:25 Assets of being an engineering leader
39:37 Testing
40:55 Dual welding: SpaceX, Phantom Space
43:25 Hiring right
47:10 Snap-on Tools
49:55 Tools
51:11 American vs. German engineering
53:58 Custom tools
56:18 Soundtrack for the book
58:49 Childhood stories
1:04:16 Innovation, exploration, Mars balloon
1:06:01 Race car’s soul
1:08:20 Is a car’s personality keyed to the individual?
1:12:45 Speeding tickets
1:18:19 Cars as stray cats and designing his own car
1:21:35 Training for racing
1:23:56 Racing training machines
1:25:02 Goals for racing
1:25:34 AI/ML in racing
1:26:55 Books
1:28:55 Staying up to date with Jim Cantrell
1:29:42 Elon Musk Twitter deal
1:32:31 Apple and privacy
1:33:53 Final thoughts on the book


In vitro biological neural networks (BNNs) interconnected with robots, so-called BNN-based neurorobotic systems, can interact with the external world and exhibit preliminary intelligent behaviors, including learning, memory, and robot control.

This work aims to provide a comprehensive overview of the intelligent behaviors presented by the BNN-based neurorobotic systems, with a particular focus on those related to robot intelligence.

In this work, we first introduce the necessary biological background for understanding two characteristics of BNNs: nonlinear computing capacity and network plasticity. We then describe the typical architecture of BNN-based neurorobotic systems and outline the mainstream techniques for realizing such an architecture from two directions: from robots to BNNs and from BNNs to robots.
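The two directions of that architecture (robot to BNN, BNN to robot) form a closed loop: sensor readings are encoded as stimulation patterns, the culture's recorded activity is decoded back into motor commands. The sketch below illustrates that loop's shape only; the "network" is a random stub I've invented, where real systems stimulate and record living cultures through multi-electrode arrays.

```python
import random

# Hedged sketch of the closed-loop BNN-neurorobotic architecture.
# cultured_network() is a stub standing in for a living culture on a
# multi-electrode array; the encode/decode functions show the loop's shape.

def encode_stimulus(distance: float) -> list:
    """Robot -> BNN: map an obstacle-distance reading to per-electrode
    stimulation intensity (nearer obstacle -> stronger stimulation)."""
    intensity = max(0.0, min(1.0, 1.0 - distance))
    return [intensity] * 8  # 8 stimulation electrodes (arbitrary choice)

def cultured_network(stimulus):
    """Stub for the culture: firing rates loosely track stimulation."""
    return [s * random.uniform(0.8, 1.2) for s in stimulus]

def decode_motor(firing_rates) -> str:
    """BNN -> robot: threshold mean recorded activity into a command."""
    mean_rate = sum(firing_rates) / len(firing_rates)
    return "turn" if mean_rate > 0.5 else "forward"
```

In a real system, the interesting part is that plasticity in the culture can change the stimulus-response mapping over time, which is where the learning behaviors discussed above come from.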

The financial industry’s response to artificial intelligence has been all over the place. Now, Bank of America is weighing in very much on the side of the bots.

In a note to clients viewed by CNBC and other outlets, BofA equity strategist Haim Israel boasted that AI was one of its top trends to watch — and invest in — for the year, and used all kinds of hypey language to convince its clients.

“We are at a defining moment — like the internet in the ’90s — where Artificial Intelligence (AI) is moving towards mass adoption,” the client note reads, “with large language models like ChatGPT finally enabling us to fully capitalize on the data revolution.”

Researchers have developed a new model inspired by recent biological discoveries that shows enhanced memory performance. This was achieved by modifying a classical neural network.

Computer models play a crucial role in investigating how the brain forms and retains memories and other complex information. Constructing such models is a delicate task, however: the intricate interplay of electrical and biochemical signals, together with the web of connections between neurons and other cell types, creates the infrastructure on which memories form. Encoding that complexity into a computer model for further study has proven difficult, in part because the underlying biology of the brain is still only partially understood.

Researchers at the Okinawa Institute of Science and Technology (OIST) have made improvements to a widely utilized computer model of memory, known as a Hopfield network, by incorporating insights from biology. The alteration has resulted in a network that not only better mirrors the way neurons and other cells are connected in the brain, but also has the capacity to store significantly more memories.
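For context, here is a minimal version of the classical Hopfield network the OIST work builds on: binary patterns are stored in a symmetric weight matrix via the Hebbian rule, and a noisy input is cleaned up by iterated threshold updates until it settles on a stored memory. This sketches only the standard model, not the biologically inspired modification the researchers introduced.

```python
import numpy as np

# Classical Hopfield network: Hebbian storage plus iterated sign updates.
# States are vectors of +1/-1; each stored pattern becomes an attractor.

def store(patterns: np.ndarray) -> np.ndarray:
    """Hebbian learning: sum of outer products, zero self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W: np.ndarray, state: np.ndarray, steps: int = 10) -> np.ndarray:
    """Synchronous updates s <- sign(W s) until the state stops changing."""
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s
```

A classical network of `n` such neurons can reliably hold only about `0.14 * n` random patterns before memories start interfering; raising that capacity by making the connectivity more brain-like is exactly the kind of improvement the OIST result describes.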