
[1hr Talk] Intro to Large Language Models

This is a 1 hour general-audience introduction to Large Language Models: the core technical component behind systems like ChatGPT, Claude, and Bard. What they are, where they are headed, comparisons and analogies to present-day operating systems, and some of the security-related challenges of this new computing paradigm.
As of November 2023 (this field moves fast!).

Context: This video is based on the slides of a talk I gave recently at the AI Security Summit. The talk was not recorded but a lot of people came to me after and told me they liked it. Seeing as I had already put in one long weekend of work to make the slides, I decided to just tune them a bit, record this round 2 of the talk and upload it here on YouTube. Pardon the random background, that’s my hotel room during the Thanksgiving break.

- Slides as PDF: https://drive.google.com/file/d/1pxx_ZI7O-Nwl7ZLNk5hI3WzAsTL…share_link (42MB)
- Slides as Keynote: https://drive.google.com/file/d/1FPUpFMiCkMRKPFjhi9MAhby68MH…share_link (140MB)

Chapters:
Part 1: LLMs.
00:00:00 Intro: Large Language Model (LLM) talk.
00:00:20 LLM Inference.
00:04:17 LLM Training.
00:08:58 LLM dreams.
00:11:22 How do they work?
00:14:14 Finetuning into an Assistant.
00:17:52 Summary so far.
00:21:05 Appendix: Comparisons, Labeling docs, RLHF, Synthetic data, Leaderboard.
Part 2: Future of LLMs.
00:25:43 LLM Scaling Laws.
00:27:43 Tool Use (Browser, Calculator, Interpreter, DALL-E)
00:33:32 Multimodality (Vision, Audio)
00:35:00 Thinking, System 1/2
00:38:02 Self-improvement, LLM AlphaGo.
00:40:45 LLM Customization, GPTs store.
00:42:15 LLM OS
Part 3: LLM Security.
00:45:43 LLM Security Intro.
00:46:14 Jailbreaks.
00:51:30 Prompt Injection.
00:56:23 Data poisoning.
00:58:37 LLM Security conclusions.
End.
00:59:23 Outro
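
The “LLM Inference” chapter above describes generation as repeatedly predicting a probability distribution over the next token, sampling one, and appending it to the context. As a minimal sketch of that loop (not code from the talk; GPT-2 and the sampling settings are stand-in assumptions), using the Hugging Face transformers library:

```python
# Minimal sketch of token-by-token LLM inference.
# GPT-2 is used as a small stand-in model; the talk discusses much larger models.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Large language models are"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):                        # generate 20 new tokens
        logits = model(input_ids).logits       # shape: (1, seq_len, vocab_size)
        probs = torch.softmax(logits[0, -1], dim=-1)        # next-token distribution
        next_id = torch.multinomial(probs, num_samples=1)   # sample one token
        input_ids = torch.cat([input_ids, next_id.unsqueeze(0)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

Everything else in the talk (training, finetuning, tool use, security) builds around this one loop: the model only ever predicts the next token given everything seen so far.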

AI is at an inflection point, Fei-Fei Li says

Li is one of the tech leaders we interviewed for the latest issue of MIT Technology Review, dedicated to the biggest questions and hardest problems facing the world. We asked big thinkers in their fields to weigh in on the underserved issues at the intersection of technology and society. Read what other tech luminaries and AI heavyweights, such as Bill Gates, Yoshua Bengio, Andrew Ng, Joelle Pineau, Emily Bender, and Meredith Broussard, had to say here.

In her newly published memoir, The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI, Li recounts how she went from an immigrant living in poverty to the AI heavyweight she is today. It’s a touching look into the sacrifices immigrants have to make to achieve their dreams, and an insider’s telling of how artificial-intelligence research rose to prominence.

When we spoke, Li told me she has her eyes set firmly on the future of AI and the hard problems that lie ahead for the field.

DARPA — robots and technologies for the future management of advanced US research | PRO Robots

👉For business inquiries: [email protected].
✅ Instagram: https://www.instagram.com/pro_robots.


0:00 Introduction.
01:03 DARPA mission.
01:30 Project ARPANET
02:09 First “smart machine” or robot.
03:05 The first self-driving vehicles and the first Boston Dynamics robot.
03:31 DARPA robot racing.
04:08 First Boston Dynamics Big Dog four-legged robot.
04:43 Energy Autonomous Tactical Robot Program.
05:00 Engineering Living Materials Program.
05:45 Spy Beetles — Hybrid Insect Micro-Electro-Mechanical Systems.
06:03 Robot Worm — Project Underminer.
06:23 DARPA — The Systems-Based Neurotechnology for Emerging Therapies.
06:57 Robotic pilots with artificial intelligence.
07:30 Artificial Intelligence Combat Air System — Air Combat Evolution.
08:14 Uncrewed Long Range Ships — Sea Train.
09:24 Project OFFSET
10:15 Project Squad X
10:47 Battle of human robots on DARPA Robotics Challenge.

The Defense Advanced Research Projects Agency (DARPA), the U.S. Department of Defense’s advanced research office, was established in 1958, almost immediately after the USSR launched Sputnik-1. The realization that the Soviets could soon put not only satellites but also missiles into space alarmed the United States government. The result was a unique agency with a huge budget that it could spend at its own discretion. Watch a selection of DARPA’s most unexpected, strange, and advanced technology and artificial-intelligence projects in one video!

The Defense Advanced Research Projects Agency (DARPA) was established in 1958, in response to the USSR’s launch of Sputnik-1. DARPA’s mission is to create innovative defense technologies, and the agency’s projects have ranged from space-based missile shields to cyborg insects. Notably, DARPA has been involved in the creation of the internet, GPS, and Siri.

DARPA invests in projects to stimulate the development of technology and see where it leads. The agency’s first significant success was ARPANET, which laid the foundation for the modern internet. Moreover, DARPA’s computer vision, navigation, and planning techniques were fundamental to the development of robotics, web servers, video games, and Mars rovers.

Newport Lecture Series: “Artificial Intelligence & Cognitive Warfare” with Yvonne Masakowski

Psychologist Yvonne R. Masakowski, Ph.D., a retired Associate Professor in the College of Leadership & Ethics at the USNWC, discusses the threat of psychological warfare in the 21st century and the disturbing possibilities that could shape how we think and act in the future. The Naval War College Foundation hosted this wide-ranging presentation — one of the most popular in our series — on February 23, 2022.

Networking nano-biosensors for wireless communication in the blood

Biological computing machines, such as micro and nano-implants that can collect important information inside the human body, are transforming medicine. Yet, networking them for communication has proven challenging. Now, a global team, including EPFL researchers, has developed a protocol that enables a molecular network with multiple transmitters.

First, there was the Internet of Things (IoT) and now, at the interface of computer science and biology, the Internet of Bio-Nano Things (IoBNT) promises to revolutionize medicine and health care. The IoBNT refers to biosensors that collect and process data, nano-scale Labs-on-a-Chip that run medical tests inside the body, the use of bacteria to design biological nano-machines that can detect pathogens, and nano-robots that swim through the bloodstream to perform targeted drug delivery and treatment.

“Overall, this is a very, very exciting research field,” explained Assistant Professor Haitham Al Hassanieh, head of the Laboratory of Sensing and Networking Systems in EPFL’s School of Computer and Communication Sciences (IC). “With advances in bio-engineering, , and nanotechnology, the idea is that nano-biosensors will revolutionize medicine because they can reach places and do things that current devices or larger implants can’t,” he continued.
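
The article doesn’t spell out how the new protocol works, but the toy Python sketch below illustrates the underlying problem it addresses: several nano-transmitters sharing one molecular channel without garbling each other’s messages. The slot-based scheme, on/off signaling, and noise model here are illustrative assumptions, not the EPFL team’s actual design.

```python
# Hypothetical illustration only: several nano-transmitters share one molecular
# channel by emitting in assigned time slots, so their signals don't collide.
# This is NOT the protocol developed by the EPFL-led team, just a toy model of
# the multiple-transmitter problem the article describes.
import random

NUM_TRANSMITTERS = 3
THRESHOLD = 0.5  # receiver decides "1" if the measured concentration exceeds this

def channel(emitted: float) -> float:
    """Noisy molecular channel: emitted concentration plus Gaussian noise."""
    return max(0.0, emitted + random.gauss(0.0, 0.1))

def send_frame(bits):
    """One frame = one time slot per transmitter; each sends a single bit."""
    decoded = []
    for bit in bits:                   # slot i belongs to transmitter i
        emitted = 1.0 if bit else 0.0  # on/off keying with molecule concentration
        measured = channel(emitted)
        decoded.append(1 if measured > THRESHOLD else 0)
    return decoded

if __name__ == "__main__":
    sent = [random.randint(0, 1) for _ in range(NUM_TRANSMITTERS)]
    print("sent:", sent, "decoded:", send_frame(sent))
```

Without some coordination scheme of this kind, concentrations from simultaneous transmitters simply add up in the shared medium, which is why networking multiple in-body sensors has been hard.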

Nvidia Q3 Earnings Explode On Surging Data Center AI And Gaming Demand

At this point, Nvidia is widely regarded as the 800-pound gorilla when it comes to silicon and software for artificial intelligence.

Beyond AI, as I mentioned previously, all of Nvidia’s business units posted quarterly growth. Though its Automotive group rose a modest 3% to $261 million, the company’s automotive design-win pipeline is projected at $14 billion in new business (numbers soon to be updated). Automotive design wins have a longer gestation period, and the company has noted that this revenue opportunity will begin materializing in 2024 and beyond. Nvidia’s Professional Visualization business unit achieved sequential growth of 9.8% to $416 million, while its OEM and Other business grew 10.6% to $73 million. Finally, Nvidia’s Gaming group delivered $2.856 billion for the quarter, compared to $2.49 billion in the previous quarter (up about 15%) and $2.24 billion in the same quarter a year ago. Here again, the company’s gaming GPUs and software are widely regarded as the performance and feature leaders in the PC gaming industry, though its chief rival AMD is beginning to execute better with its Radeon product line, paired with its potent Ryzen CPUs as a one-two punch platform solution.
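
As a quick sanity check on the gaming figures quoted above, the short calculation below derives the sequential and year-over-year growth rates from the dollar amounts given in the text (Python used purely as a calculator).

```python
# Back-of-the-envelope check of the Nvidia gaming figures quoted above
# (all dollar amounts come directly from the text, in billions of USD).
q3_gaming = 2.856       # latest quarter
q2_gaming = 2.49        # previous quarter (Q2)
year_ago_gaming = 2.24  # same quarter a year earlier

sequential_growth = (q3_gaming / q2_gaming - 1) * 100
yoy_growth = (q3_gaming / year_ago_gaming - 1) * 100

print(f"Sequential gaming growth: {sequential_growth:.1f}%")  # ~14.7%, i.e. "up about 15%"
print(f"Year-over-year gaming growth: {yoy_growth:.1f}%")     # ~27.5%
```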

Moving forward, the company guided for a nice round $20 billion for its Q4 FY24 revenue, representing a projected 11% sequential gain. There will be a bit of headwind, of course, from competitors like AMD, which is expected to deliver its MI300 GPU AI accelerators in December at its Advancing AI event. That said, it’s going to be a tough slog for all competitors, given Nvidia’s long-building momentum as the clear leader and incumbent in AI. Another component of the company’s data center silicon portfolio, the Grace Hopper combined CPU-GPU Superchip, is just coming online now as well, competing for host-processor sockets in AI data centers; Huang noted it is “on a very, very fast ramp with our first data center CPU to a multi-billion dollar product line.”

Any way you slice it, there’s no stopping Nvidia from this level of growth for the foreseeable future, as AI adoption tracks a similar curve. The company continues to execute like a finely tuned machine, and the numbers, as they say, don’t lie.

US chip export ban is hurting China’s AI startups, not so much the giants yet

Well before Washington banned Nvidia’s exports of high-performance graphics processing units to China, the country’s tech giants had been hoarding them in anticipation of an escalating tech war between the two nations.

Baidu, one of the tech firms building China’s counterparts to OpenAI, has secured enough AI chips to keep training its ChatGPT equivalent Ernie Bot for the “next year or two,” the firm’s CEO Robin Li said on an earnings call this week.

“Also, inference requires less powerful chips, and we believe our chip reserves, as well as other alternatives, will be sufficient to support lots of AI-native apps for the end users,” he said. “And in the long run, having difficulties in acquiring the most advanced chips inevitably impacts the pace of AI development in China. So, we are proactively seeking alternatives.”
