
  • Fraud detection techniques mostly stem from the anomaly detection branch of data science.
  • If the dataset has a sufficient number of fraud examples, supervised classification algorithms such as random forest or logistic regression can be used for fraud detection.
  • If the dataset has no fraud examples, we can use either an outlier detection approach based on the isolation forest technique or anomaly detection with a neural autoencoder.
  • After the machine learning model has been trained, it’s evaluated on the test set using metrics such as sensitivity and specificity, or Cohen’s Kappa (see the sketch after this list).
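As a minimal sketch of the two scenarios and the evaluation metrics above, the snippet below assumes scikit-learn and a synthetic, highly imbalanced dataset standing in for card transactions; all variable names and parameters are illustrative, not a specific production pipeline.

```python
# Minimal sketch: supervised fraud classification vs. unsupervised outlier detection.
# Synthetic, highly imbalanced data stands in for card transactions
# (1 = fraud, 0 = legitimate); names and parameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, IsolationForest
from sklearn.metrics import cohen_kappa_score, recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=10,
                           weights=[0.98, 0.02], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Case 1: labeled fraud examples are available -> supervised classifier.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)
print("sensitivity:", recall_score(y_test, pred))               # true-positive rate
print("specificity:", recall_score(y_test, pred, pos_label=0))  # true-negative rate
print("Cohen's kappa:", cohen_kappa_score(y_test, pred))

# Case 2: no labels -> isolation forest flags outliers as candidate fraud.
iso = IsolationForest(contamination=0.02, random_state=0)
iso.fit(X_train)
outliers = iso.predict(X_test) == -1   # -1 marks predicted outliers
print("transactions flagged as outliers:", int(outliers.sum()))
```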

With global credit card fraud losses on the rise, it is important for banks, as well as e-commerce companies, to be able to detect fraudulent transactions before they are completed.

According to the Nilson Report, a publication covering the card and mobile payment industry, global card fraud losses amounted to $22.8 billion in 2016, an increase of 4.4% over 2015. This confirms the importance of the early detection of fraud in credit card transactions.

Several companies, like SignAll and Kintrans, have created hand-tracking software that tries, with little success so far, to allow the millions of people who use sign language to communicate easily with anyone through an app.

Now, a new hand-tracking algorithm from Google’s AI labs might be a big step in making this ambitious software everything it originally promised.


Some call it “strong” AI, others “real” AI, “true” AI or artificial “general” intelligence (AGI)… whatever the term (and important nuances), there are few questions of greater importance than whether we are collectively in the process of developing generalized AI that can truly think like a human — possibly even at a superhuman intelligence level, with unpredictable, uncontrollable consequences.

This has been a recurring theme of science fiction for many decades, but given the dramatic progress of AI over the last few years, the debate has been flaring anew with particular intensity, with an increasingly vocal stream of media and conversations warning us that AGI (of the nefarious kind) is coming, and much sooner than we’d think. The latest example: the new documentary Do You Trust This Computer?, which streamed last weekend for free courtesy of Elon Musk, and features a number of respected AI experts from both academia and industry. The documentary paints an alarming picture of artificial intelligence, a “new life form” on planet Earth that is about to “wrap its tentacles” around us.

NEW DELHI (AP) — An unmanned spacecraft India launched last month began orbiting the moon Tuesday as it approaches the lunar south pole to study previously discovered water deposits.

The Indian Space Research Organization said it successfully maneuvered Chandrayaan-2, the Sanskrit word for “moon craft,” into lunar orbit, nearly a month after it left Earth. The mission is led by two female scientists.

Chandrayaan will continue circling the moon in a tighter orbit until reaching a distance of about 100 kilometers (62 miles) from the moon’s surface.

WASHINGTON — If you’ve ever tried to swat a fly, you know that insects react to movement extremely quickly. A newly created biologically inspired compound eye is helping scientists understand how insects use their compound eyes to sense an object and its trajectory with such speed. The compound eye could also be used with a camera to create 3D location systems for robots, self-driving cars and unmanned aerial vehicles.

In The Optical Society (OSA) journal Optics Letters, researchers from Tianjin University in China report their new bio-inspired compound eye, which not only looks like that of an insect but also works like its natural counterpart. Compound eyes consist of hundreds to thousands of repeating units known as ommatidia that each act as a separate visual receptor.

“Imitating the vision system of insects has led us to believe that they might detect the trajectory of an object based on the light intensity coming from that object rather than using precise images like human vision,” said Le Song, a member of the research team. “This motion-detection method requires less information, allowing the insect to quickly react to a threat.”
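To make the idea in that quote concrete, here is a toy sketch (not the Tianjin team’s method): each “ommatidium” reports only a scalar light intensity, and the object’s motion direction is estimated from the shift of the intensity-weighted centroid between two frames. The grid size and readings are invented for illustration.

```python
# Toy illustration of intensity-based motion sensing: infer direction of motion
# from the shift of the intensity-weighted centroid across a grid of sensors.
import numpy as np

def intensity_centroid(readings, positions):
    """Average of sensor positions, weighted by each sensor's light intensity."""
    w = readings / readings.sum()
    return (positions * w[:, None]).sum(axis=0)

# Hypothetical 3x3 grid of ommatidia and two snapshots of intensity readings.
positions = np.array([(x, y) for x in range(3) for y in range(3)], dtype=float)
frame_t0 = np.array([1, 1, 1, 1, 9, 1, 1, 1, 1], dtype=float)  # bright spot at center
frame_t1 = np.array([1, 1, 1, 1, 1, 9, 1, 1, 1], dtype=float)  # spot shifted along +y

velocity = intensity_centroid(frame_t1, positions) - intensity_centroid(frame_t0, positions)
print("estimated motion direction:", velocity)  # points along +y
```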

Robots from all over the world are about to go on a subterranean adventure, competing against each other in mining tunnels to determine which ones can best navigate and find objects underground and do so autonomously.

The Defense Advanced Research Projects Agency (DARPA) is hosting the Subterranean Challenge Systems Competition on Aug. 15–22 in Pittsburgh as a way to develop technology for the military and first responders to map and search subterranean areas.

After breaking all the records related to training computer vision models, NVIDIA now claims that its AI platform is able to train a natural language neural network model based on one of the largest datasets in record time. It also claims that the inference time is just 2 milliseconds, which translates to an extremely fast response from the model participating in a conversation with a user.

After computer vision, natural language processing is one of the top applications of AI. From Siri to Alexa to Cortana to Google Assistant, all conversational user experiences are powered by AI.

The advancements in AI research are putting the power of language understanding and conversational interfaces into the hands of developers. Data scientists and developers can now build custom AI models that work much like Alexa and Siri but for specialized, highly customized industry use cases in verticals such as healthcare or legal. This enables doctors and lawyers to interact with expert agents that understand the terminology and the context of the conversation. This new user experience is going to become part of future lines of business applications.
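As a hedged sketch of what such a domain-specific model might look like in practice (this is not NVIDIA’s pipeline), the snippet below fine-tunes a generic pretrained transformer on a couple of hypothetical legal-domain sentences using the Hugging Face `transformers` library; the model name, labels, and examples are all illustrative.

```python
# Hedged sketch: fine-tuning a pretrained transformer for a domain-specific
# text-classification task. Model name, labels, and examples are illustrative.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # e.g. "relevant clause" vs. "other"

# Hypothetical legal-domain training examples.
texts = ["The lessee shall indemnify the lessor against all claims.",
         "Lunch will be provided in the main hall."]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)   # forward pass returns the loss
outputs.loss.backward()                   # one gradient step, for illustration
optimizer.step()
```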

Harvard University researchers have developed a new powered exosuit that can make you feel as much as a dozen pounds lighter when walking or running. Scientific American reports that the 11-pound system, which is built around a pair of flexible shorts and a motor worn on the lower back, could benefit anyone who has to cover large distances on foot, including recreational hikers, military personnel, and rescue workers.

According to the researchers, who have published their findings in the journal Science, this system differs from previous exosuits because it makes both walking and running easier. The challenge, as shown by a video accompanying the research, is that your legs work very differently depending on whether you’re walking or running. When walking, the team says your center of mass moves like an “inverted pendulum,” while running causes it to move like a “spring-mass system.” The system needs to accommodate both modes and sense when the wearer’s gait changes.
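As a toy illustration of gait sensing (not the Harvard team’s actual controller), the sketch below labels a window of vertical center-of-mass acceleration as walking or running by checking for a free-fall flight phase, which only running has; the threshold and sensor readings are invented assumptions.

```python
# Toy sketch: distinguish walking from running using vertical acceleration.
# Running has a flight phase where the body is in free fall (about -9.8 m/s^2),
# so a window whose minimum acceleration approaches free fall is labeled "running".
import numpy as np

GRAVITY = 9.81  # m/s^2

def classify_gait(vertical_accel, free_fall_tol=2.0):
    """Label a window of vertical acceleration samples (m/s^2, so free fall
    reads about -GRAVITY) as 'running' or 'walking'."""
    if np.min(vertical_accel) < -(GRAVITY - free_fall_tol):
        return "running"
    return "walking"

# Hypothetical sensor windows.
walking_window = np.array([-1.2, 0.8, 2.5, 1.0, -2.0, -0.5])
running_window = np.array([-9.5, -9.7, 3.0, 12.0, 4.0, -8.9])
print(classify_gait(walking_window))  # -> walking
print(classify_gait(running_window))  # -> running
```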