
https://youtube.com/watch?v=R0NP5eMY7Q8&feature=share

Quantum algorithms: An algorithm is a sequence of steps that leads to the solution of a problem. To execute these steps on a device, one must use the specific instruction sets that the device is designed to carry out.

Quantum computing introduces instruction sets based on a fundamentally different model of execution than classical computing. The aim of quantum algorithms is to use quantum effects like superposition and entanglement to reach solutions faster than classical approaches allow.
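As a minimal illustration of superposition, the snippet below simulates a single qubit with plain linear algebra: a Hadamard gate maps the basis state |0⟩ into an equal superposition, so a measurement would yield 0 or 1 with equal probability. This is a toy sketch using NumPy, not code for any particular quantum device.

```python
import numpy as np

# A qubit starts in the basis state |0>, represented as the vector (1, 0).
ket0 = np.array([1.0, 0.0])

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
state = H @ ket0

# Measurement probabilities are the squared amplitudes of the state vector.
probs = np.abs(state) ** 2
print(probs)  # approximately [0.5, 0.5]
```

Running many such gates on many entangled qubits is what lets quantum algorithms explore amplitudes that classical instruction sets cannot represent compactly.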

Source:
Artificial Intelligence vs Artificial General Intelligence: Eric Schmidt Explains the Difference.

https://youtu.be/VFuElWbRuHM


Tool use has long been a hallmark of human intelligence, as well as a practical problem to solve for a vast array of robotic applications. But machines are still clumsy at exerting just the right amount of force to control tools that aren’t rigidly attached to their hands.

To manipulate said tools more robustly, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), have designed a system that can grasp tools and apply the appropriate amount of force for a given task, like squeegeeing up liquid or writing out a word with a pen.

The system, dubbed Series Elastic End Effectors, or SEED, uses soft bubble grippers and embedded cameras to map how the grippers deform over a six-dimensional space (think of an airbag inflating and deflating) and apply force to a tool. With six degrees of freedom, the object can be moved left and right, up and down, back and forth, and rotated in roll, pitch, and yaw. The closed-loop controller—a self-regulating system that maintains a desired state without external intervention—uses SEED and visuotactile feedback to adjust the position of the robot arm in order to apply the desired force.
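The core idea of such a closed-loop force controller can be sketched as a simple proportional law: compare the measured force against the target and nudge the arm position to shrink the error. The function and gain below are illustrative assumptions, not SEED's actual control API.

```python
def force_step(desired_force: float, measured_force: float,
               gain: float = 0.01) -> float:
    """Return a position correction (in meters, along the tool axis)
    that nudges the applied force toward the target: press in when
    below the target, ease off when above it."""
    error = desired_force - measured_force
    return gain * error

# Example: targeting 5 N while reading 3 N yields a small forward
# correction; at 7 N the correction reverses sign.
print(force_step(5.0, 3.0))  # positive -> press in
print(force_step(5.0, 7.0))  # negative -> ease off
```

In a real system this step would run inside a sensing loop, with the measured force coming from the deformation of the bubble grippers seen by the embedded cameras.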

Those venturing into the architecture of the metaverse have already asked themselves this question. A playful environment where all formal dreams are possible, where determining aspects for architecture such as solar orientation, ventilation, and climate will no longer be necessary, where – to Louis Kahn’s despair – there is no longer a dynamic of light and shadow, just an open and infinite field. The metaverse is an extension, or as some call it, a combination of several powerful technologies: augmented reality, virtual reality, mixed reality, artificial intelligence, blockchain, and 3D worlds.

The technology is still being researched. Even so, the metaverse seems poised to make a significant difference in the education domain, and its ability to connect students across the world on a single platform may bring a positive change. But the metaverse is not only about remote learning. It is much more than that.

Architecture emerged on the construction site, at a time when there was no drawing, only experimentation. Over time, thanks to Brunelleschi and the Florence dome in the 15th century, we witnessed the first detachment from masonry, a social division of labor from which liberal art and mechanical art emerged. This detachment generated different challenges and placed architecture on an oneiric plane, tied to paper. In other words, we don’t build structures; we design them. Now, six centuries later, it looks like we are getting ready to take another step away from the construction site, abruptly distancing ourselves from engineering and construction.

Engineered living materials promise to aid efforts in human health, energy and environmental remediation. Now they can be built big and customized with less effort.

Bioscientists at Rice University have introduced centimeter-scale, slime-like colonies of engineered bacteria that self-assemble from the bottom up. They can be programmed to soak up contaminants from the environment or to catalyze biological reactions, among many possible applications.

The creation of autonomous engineered living materials—or ELMs—has been a goal of bioscientist Caroline Ajo-Franklin since long before she joined Rice in 2019.

Following the success of the inaugural competition in 2021, Amazon is officially launching the Alexa Prize TaskBot Challenge 2. Starting today, university teams across the globe can apply to compete in developing multimodal conversational agents that assist customers in completing tasks requiring multiple steps and decisions. The first-place team will take home a prize of $500,000.

The TaskBot Challenge 2, which will begin in January 2023, addresses one of the hardest problems in conversational AI — creating next-generation conversational AI experiences that delight customers by addressing their changing needs as they complete complex tasks. It builds upon the Alexa Prize’s foundation of providing universities a unique opportunity to test cutting-edge machine learning models with actual customers at scale.

Amid the festivities at its fall 2022 GTC conference, Nvidia took the wraps off new robotics-related hardware and services aimed at companies developing and testing machines across industries like manufacturing. Isaac Sim, Nvidia’s robotics simulation platform, will soon be available in the cloud, the company said. And Nvidia’s lineup of system-on-modules is expanding with Jetson Orin Nano, a system designed for low-powered robots — plus a new platform called IGX.

Isaac Sim, which launched in open beta last June, allows designers to simulate robots interacting with mockups of the real world (think digital re-creations of warehouses and factory floors). Users can generate datasets from simulated sensors to train the models on real-world robots, leveraging synthetic data from batches of parallel, unique simulations to improve the model’s performance.
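The batches of parallel, unique simulations described above rest on the idea of domain randomization: each simulated scene varies lighting, textures, and object poses so a model trained on the batch generalizes beyond any single rendering. The sketch below is a toy illustration of that sampling pattern; the parameter names and ranges are assumptions, not Isaac Sim's API.

```python
import random

def sample_scene(seed: int) -> dict:
    """Generate one randomized simulation scene description.
    Field names and ranges are illustrative placeholders."""
    rng = random.Random(seed)
    return {
        "light_intensity": rng.uniform(0.2, 1.0),
        "floor_texture": rng.choice(["concrete", "steel", "wood"]),
        # Six degrees of freedom: x, y, z, roll, pitch, yaw.
        "object_pose": [rng.uniform(-1.0, 1.0) for _ in range(6)],
    }

# A batch of unique scenes, each of which would be rendered and
# paired with ground-truth labels to form a synthetic training set.
dataset = [sample_scene(i) for i in range(1000)]
print(len(dataset))  # 1000
```

Because every label in such a dataset comes from the simulator itself, annotation is free and perfectly accurate, which is a large part of synthetic data's appeal.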

It’s not just marketing bluster, necessarily. Some research suggests that synthetic data has the potential to address many of the development challenges plaguing companies attempting to operationalize AI. MIT researchers recently found a way to classify images using synthetic data, and nearly every major autonomous vehicle company uses simulation data to supplement the real-world data they collect from cars on the road.

Using artificial intelligence and editing software, photographer Alper Yesiltas has resurrected stars who died when they were young.

The Turkey-based photographer, who created the portraits for a project titled ‘As If Nothing Happened,’ said: “With the development of AI technology, I’ve been excited for a while, thinking that anything imaginable can be shown in reality.”

Sharing the haunting and realistic images on his Instagram handle, Yesiltas said: “When I started tinkering with technology, I saw what I could do and thought about what would make me the happiest. I wanted to see some of the people I missed again in front of me and that’s how this project emerged.”

In this talk, Kurzweil explores the history and trajectory of advances in computing and Information Technology to project how he believes Artificial Intelligence (AI) may enhance our natural biological intelligence in the future.

Kurzweil spoke at the Nobel Week Dialogue on December 9, 2015 in Gothenburg, Sweden.

Nobel Week Dialogue is a free of charge, full-day event and part of the official Nobel Week programme. The event aims to stimulate discussion at the highest level on a topical science-related theme by bringing together Nobel Laureates, the world’s leading scientists and experts, key opinion leaders, policy makers and the general public, online as well as on site. By bridging science and society, it’s an opportunity to stimulate thinking, excite imagination and inspire greatness! http://www.nobelweekdialogue.org