
Historical accounts of the mortality outcomes of the Black Death plague pandemic are variable across Europe, with much higher death tolls suggested in some areas than others. Here the authors use a ‘big data palaeoecology’ approach to show that land use change following the pandemic was spatially variable across Europe, confirming heterogeneous responses with empirical data.

Smart cities are supposed to represent the pinnacle of technological and human advancement. They certainly deliver on that promise from a technological standpoint. Smart cities employ connected IoT networks, AI, computer vision, NLP, blockchain and other such technologies to bolster urban computing, which is used to optimize functions in law enforcement, healthcare, traffic management, supply chain management and countless other areas. As human advancement is more ideological than physical, measuring it comes down to a single metric: the level of equity and inclusivity in smart cities. Essentially, these factors come down to how well smart city administrators can reduce digital exclusivity, eliminate algorithmic discrimination and increase citizen engagement. To make smart cities more inclusive for people and communities from all strata of society, public agencies in these regions need to address issues related to digital exclusion, data integrity and bias in AI; doing so can resolve the majority of inclusivity problems and meet the above-mentioned objectives.

The algorithms spot and classify objects made of synthetic materials based on the distinctive manner in which they reflect polarized light: light reflected from human-made objects often differs from light reflected from natural surfaces such as vegetation, soil and rocks.

The researchers tested such a camera both on the ground and from a US Coast Guard helicopter flying at the altitude at which the polarimetric-camera-equipped drones will operate.

Once the system is fully operational, data collected by the drone-based machine learning system will be used to make maps showing where marine debris is concentrated along the coast, to guide rapid response and removal efforts. The researchers will provide NOAA Marine Debris Program staff with training in the use of the new system, along with a standard operating procedures manual.

Quantum computing and machine learning are two of the most exciting technologies that can transform businesses, and we can only imagine how powerful they can be in combination. Integrating quantum algorithms into machine learning programs is called quantum machine learning. This fascinating area has become a major focus for tech firms, which have brought out tools and platforms to deploy such algorithms effectively. Some of these include TensorFlow Quantum from Google, the Quantum Machine Learning (QML) library from Microsoft, and QC Ware Forge built on Amazon Braket.

Students skilled in working with quantum machine learning algorithms can be in great demand due to the opportunities the field holds. Let us have a look at a few online courses one can use to learn quantum machine learning.

In this course, students will start with the basics of quantum computing and quantum machine learning. The course also covers building QNodes and customised templates, and teaches students to use autograd and compute loss functions on quantum circuits using PennyLane, and to develop with the Pennylane.ai API. Students will also learn how to build their own PennyLane plugin and turn quantum nodes into TensorFlow Keras layers.
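To give a flavour of what such courses build toward, here is a minimal NumPy sketch of the core object in variational quantum machine learning: a parameterized one-qubit circuit whose measured expectation value is trained against a cost function. This is a from-scratch illustration of the concept, not PennyLane's API; the gate choice, target value and finite-difference optimizer are generic textbook choices (frameworks like PennyLane differentiate circuits analytically and wrap them as QNodes).

```python
import numpy as np

def rx(theta):
    # Single-qubit rotation gate about the X axis.
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

def expval_z(theta):
    # Prepare |0>, apply RX(theta), then measure the Pauli-Z expectation.
    state = rx(theta) @ np.array([1.0, 0.0])
    z = np.array([[1, 0], [0, -1]])
    return np.real(np.conj(state) @ z @ state)

def cost(theta, target=-1.0):
    # Squared distance between the circuit's output and a target expectation.
    return (expval_z(theta) - target) ** 2

# Crude gradient descent via finite differences; real QML frameworks
# compute circuit gradients analytically (e.g. parameter-shift rules).
theta, lr, eps = 0.1, 0.4, 1e-6
for _ in range(200):
    grad = (cost(theta + eps) - cost(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(expval_z(theta))  # driven close to the target of -1.0
```

The "quantum" part here is just a 2x2 unitary acting on a state vector; the training loop is ordinary machine learning, which is exactly the hybrid structure these courses teach.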

Connecting & Enabling a Smarter Planet — Alistair Fulton, VP, Wireless & Sensing Products, Semtech.


Alistair Fulton (https://www.semtech.com/company/executive-leadership/alistair-fulton) is the Vice President and General Manager of Semtech’s Wireless and Sensing Products Group.

Semtech Corporation is a supplier of analog and mixed-signal semiconductors and advanced algorithms for consumer, enterprise computing, communications and industrial end-markets. It has 32 locations in 15 countries in North America, Europe, and Asia.

Machine learning can work wonders, but it’s only one tool among many.

Artificial intelligence is among the most poorly understood technologies of the modern era. To many, AI exists as both a tangible but ill-defined reality of the here and now and an unrealized dream of the future, a marvel of human ingenuity, as exciting as it is opaque.

It’s this indistinct picture of both what the technology is and what it can do that might engender a look of uncertainty on someone’s face when asked the question, “Can AI solve climate change?” “Well,” we think, “it must be able to do *something*,” while entirely unsure of just how algorithms are meant to pull us back from the ecological brink.

Such ambivalence is understandable. The question is loaded, faulty in its assumptions, and more than a little misleading. It is a vital one, however, and the basic premise of utilizing one of the most powerful tools humanity has ever built to address the most existential threat it has ever faced is one that warrants our genuine attention.


Machine learning, a form of artificial intelligence, vastly speeds up computational tasks and enables new technology in areas as broad as speech and image recognition, self-driving cars, stock market trading and medical diagnosis.

Before going to work on a given task, algorithms typically need to be trained on pre-existing data so they can learn to make fast and accurate predictions about future scenarios on their own. But what if the job is a completely new one, with no data available for training?

Now, researchers at the Department of Energy’s SLAC National Accelerator Laboratory have demonstrated that they can use machine learning to optimize the performance of particle accelerators by teaching the algorithms the basic principles behind operations—no prior data needed.
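One common pattern behind results like this is to train a model on samples generated from a physics simulation rather than on archived machine data. The sketch below illustrates that idea with an invented quadratic "beam response" standing in for an accelerator simulation; it is not SLAC's actual model or method, just a minimal demonstration of learning a usable surrogate with no prior measured data.

```python
import numpy as np

rng = np.random.default_rng(0)

def beam_size(quad_strength):
    # Toy physics model standing in for an accelerator simulation:
    # beam size is minimized at a known-optimal magnet setting (0.3).
    return 1.0 + (quad_strength - 0.3) ** 2

# Generate training data from the physics model alone -- no archived
# machine measurements are needed.
X = rng.uniform(-1, 1, size=200)
y = beam_size(X)

# Fit a small polynomial surrogate to the simulated samples.
coeffs = np.polyfit(X, y, deg=2)
surrogate = np.poly1d(coeffs)

# Use the surrogate to pick the best setting over a candidate grid.
grid = np.linspace(-1, 1, 1001)
best = grid[np.argmin(surrogate(grid))]
print(round(best, 2))  # recovers the optimum of 0.3
```

Once such a surrogate exists, it can propose good operating points for the real machine immediately, and be refined as genuine measurements start to arrive.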