{"id":4592,"date":"2012-08-20T04:49:44","date_gmt":"2012-08-20T11:49:44","guid":{"rendered":"http:\/\/lifeboat.com\/blog\/?p=4592"},"modified":"2012-08-20T15:19:08","modified_gmt":"2012-08-20T22:19:08","slug":"enhanced-ai-the-key-to-unmanned-space-exploration","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2012\/08\/enhanced-ai-the-key-to-unmanned-space-exploration","title":{"rendered":"Enhanced AI: The Key to Unmanned Space Exploration"},"content":{"rendered":"<p>The precursor to manned space exploration of new worlds is typically unmanned exploration, and NASA has made phenomenal progress with remotely controlled rovers on the Martian surface in recent years: MER-A Spirit, MER-B Opportunity and now MSL Curiosity. However, for all our reliance on AI in such rovers (similar to, if not more advanced than, the AI technology we see around us in the automotive and aviation industries, such as operational real-time clear-air turbulence prediction in aviation), such AI typically aids control systems rather than mission-level decision making. NASA still controls the rover via detailed commands transmitted directly from Earth: typically 225 kbit\/day of commands at a data rate of 1\u20132 kbit\/s during a 15-minute transmit window, with the larger volumes of data collected by the rover returned via satellite relay. This one-way communication incorporates a delay of, on average, 12 or so light minutes. 
This becomes less and less practical the farther away the rover is.<\/p>\n<p>If, for example, we landed a similar rover on Titan in the future, I would expect the current method of step-by-step remote control to render the mission impractical, since Saturn is typically at least 16 times more distant, depending on the time of year.<\/p>\n<p>With the tasks of the science labs well determined in advance, it should be practical to develop AI engines that react to hazards, change the course of analysis depending on the data processed, and so on: the perfect playground for advanced AI programmes. The current Curiosity mission incorporates tasks such as:<\/p>\n<ol>\n<li>Determine the mineralogical composition of the Martian surface and near-surface geological materials.<\/li>\n<li>Attempt to detect chemical building blocks of life (bio-signatures).<\/li>\n<li>Interpret the processes that have formed and modified rocks and soils.<\/li>\n<li>Assess long-timescale (i.e., 4-billion-year) Martian atmospheric evolution processes.<\/li>\n<li>Determine the present state, distribution, and cycling of water and carbon dioxide.<\/li>\n<li>Characterize the broad spectrum of surface radiation, including galactic radiation, cosmic radiation, solar proton events and secondary neutrons.<\/li>\n<\/ol>\n<p>
All of these processes map results to action points in a largely deterministic way, which could form the foundation of an AI learning engine, so that such rovers can be entrusted with making their own mission-level decisions on the next phases of exploration based on such AI analyses.<\/p>\n<p>Whilst the current exploration of Mars works quite well with the remote-control strategy, it would show great foresight for NASA to engineer such unmanned rovers to operate in a more independent fashion, with AI handling mission-level control and learning to adapt to the environment as the rover explores the terrain. Only the return link would be in regular use, relaying back the analyzed data, with the low-bandwidth control link reserved for maintenance and corrective action only. NASA has taken great strides in the last decade with unmanned missions. One can expect the next generation to be even more fascinating, and perhaps a trailblazer for advanced AI-based technology.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The precursor to manned space exploration of new worlds is typically unmanned exploration, and NASA has made phenomenal progress with remotely controlled rovers on the Martian surface in recent years: MER-A Spirit, MER-B Opportunity and now MSL Curiosity. 
However, for all our reliance on AI in such rovers, similar to, if not [\u2026]<\/p>\n","protected":false},"author":196,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[38,6,8],"tags":[],"class_list":["post-4592","post","type-post","status-publish","format-standard","hentry","category-engineering","category-robotics-ai","category-space"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/4592","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/196"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=4592"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/4592\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=4592"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=4592"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=4592"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}