
If you’re of a certain age, you probably know Huntsville, Alabama’s Space Camp best as a prize for winning a ridiculous competition show. And if you ever obsessed over going on that cosmic retreat, you probably wanted to get on that weird spinning chair they always showed in the clips. It’s a serious-looking device at a serious facility, so what the heck is it for?

I was recently lucky enough to make a childhood dream come true and zipped up my flight suit for a shot at Space Camp. There, as I explain in the video above, I learned that the spinning chair has a more formal name: the Multi Axis Trainer, or MAT. It’s used to give riders a feeling of what it’s like to uncontrollably tumble through space.

Read more

How many times have you heard someone say that the pursuit of beauty, or of its preservation over time, is a “vain” endeavor? My guess would be probably many. That’s why you need to tread carefully if you plan to present the preservation of looks as an argument in favor of rejuvenation biotechnology—you might be stepping into a minefield.

Quite frankly, I never got what’s so wrong with wanting to maintain youthful beauty over time, and I’d tend to think we’re dealing with a fox-and-grapes situation here.

Read more

Neural networks have been taking off since AlexNet in 2012. We don’t have to call it a software war, but there is a competition for mindshare and community contributors in neural networks.

Of course, AI needs more than a neural network library: it also needs configuration hyperparameters, training datasets, trained models, test environments, and more.

Most people have heard of Google’s TensorFlow, which was released at the end of 2015, but there is an active codebase called PyTorch that is easier to understand, less of a black box, and more dynamic. TensorFlow does have solutions for some of those limitations (such as TensorFlow Fold and TensorFlow Eager), but these new capabilities make much of TensorFlow’s other machinery and complexity unnecessary. Google built a high-performance system for doing static computation graphs before realizing that most people want dynamic graphs. Doh!
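To make the static-versus-dynamic distinction concrete, here is a minimal PyTorch sketch (my own illustration, not from the original post) of a define-by-run graph, where ordinary Python control flow can depend on the data:

```python
import torch

# Define-by-run: the graph is built as this code executes, so a plain
# Python loop whose iteration count depends on the data "just works".
x = torch.ones(3, requires_grad=True)
y = x * 2
while y.norm() < 10:   # data-dependent control flow
    y = y * 2
loss = y.sum()
loss.backward()        # autograd traverses however many iterations ran
print(x.grad)          # tensor([8., 8., 8.])
```

In a classic static-graph framework, that loop would have to be expressed with special graph operations (TensorFlow’s `tf.while_loop`, for example) rather than ordinary language constructs.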

And how much do you trust Google, anyway?

PyTorch was created by people from Idiap Research Institute in Switzerland, who went to Facebook and Google. Doh!

I posted a bug report on the PyTorch license, asking for a copyleft one: https://github.com/pytorch/pytorch/issues/5270

I think you should consider a copyleft license. I realize it’s a pain to change the license, but it never gets easier. I read the license and it’s mostly a disclaimer and a warning. There’s nothing in there about protecting the freedom of the users.

There are lots of projects with lax licenses that are successful, so maybe it will work out okay, but the Linux kernel took off because of the copyleft license. It nudges people to give back.

Lax licenses let companies take advantage of the individual contributors. I don’t understand how someone who believes in free software also believes letting big companies turn it back into proprietary software is fine.

I realize lawyers might like that, and proprietary software companies might want it, but this group is more than just those people. It’s great you’ve got 100s of contributors already, but if you know the way corporations work, you should be pushing for copyleft.

My bug was closed within 8 hours with the following response from a Facebook employee:

we’ve definitely thought about this in the past. We have no plans of changing our license.

The bug was closed but I could keep commenting:

When you say “we”, are you talking about Facebook or the random smaller contributors? Given you work for a large company, I hope you realize you could be biased. At the same time, you should know the way large corporations work even better. You won’t be there forever. Copyleft is stronger protection for the software and the users; do you disagree?

When you say “thought”, have you written any of it down with a link you can post for archival purposes? That way if others come along, they’ll have a good answer. I may quote your non-defense of your lax license in my writings if you don’t mind, but I’d prefer if you gave me a bit more.

I just spent several minutes looking for a discussion of the PyTorch license, and came up with nothing except another bug report closed with a similarly short answer.

Your last dismissive answer could motivate people to create a copyleft fork!

I got one more response:

We = the authors of the project.

“thought” = this is a topic that came up in the past, we discussed it among ourselves. I don’t have it written down, we don’t plan to have it written down.

I wrote one more response:

I don’t know any of these names:
https://www.openhub.net/p/pytorch/contributors

I don’t know who the authors are of this project, and how much is big companies versus academics and small contributors, how much interest there is in making a copyleft version, etc.

BTW, relicensing would get you plenty of news articles. It’s also tough because Facebook doesn’t have the same reputation as the FSF or EFF for protecting users’ freedom. The TensorFlow license is lax also, so you don’t have that competitive advantage.

To some it’s a disadvantage, but it did make a difference in the Linux scheme, and you would hope to have your work be relevant for that long, and without a bunch of proprietary re-implementations over time that are charged for. The lax license could also slow software innovation because everyone is mostly improving their secret code on top.

LibreOffice was able to convince a lot of people that a copyleft license was better than the OpenOffice scheme, but I don’t know what people here think. One interesting data point would be to find out what percent of the patches and other work are by small contributors.

Anyway, you’ve got a cool project, and I wish you the best, partially because I don’t trust Google. Tensorflow is just some sample code for others to play with while they advance the state of the art and keep 95% proprietary. It also seems they made a few mistakes in the design and now will carry baggage.

There is a deep learning software war going on. It’s kind of interesting to almost be on the side of Facebook (wink).

It’s a shame that copyleft seems to be losing mindshare. If the contributors who like copyleft lit some torches, and created a fork, or threatened to, it could get the attention of the large corporations and convince them to relicense rather than risk the inefficiencies, bad press, slower progress and loss of relevance. Forks are a bad thing, but copyleft can prevent future forks, and prevent people from taking but not giving back.

Whether a PyTorch fork makes sense depends on a number of factors. The LibreOffice fork was created because people were unhappy about how Sun and then Oracle were working with the community, etc. If the only thing wrong with PyTorch is the lax license, it might become successful without needing the copyleft nudge, but how much do you trust Facebook and Google to do the right thing long-term?

I wish PyTorch used the AGPL license. Most neural networks run on servers today; they are hardly used on the Linux desktop. Data is central to AI, and that can stay owned by FB and the users, of course. The ImageNet dataset created a revolution in computer vision, so let’s not forget that open datasets can be useful.

A license like the GPL wouldn’t even apply to Facebook because the code runs on servers, but it would make a difference in other places where PyTorch could be used. You’d think Facebook could have just agreed to use a GPL or LGPL license, and silently laugh as they know the users don’t run their AI software.

Few people run Linux kernels remotely so the GPL is good enough for it. Perhaps it isn’t worth making a change to the PyTorch license unless they switch to AGPL. Or maybe that’s a good opening bid for those with torches and pitchforks.

I posted a link to this on the Facebook Machine Learning group, and my post was deleted and I was banned from the group!

I posted a link to the Google Deep Learning group and got some interesting responses. One person said that copyleft is inhibiting. I replied that if keeping free software free is inhibiting, there isn’t a word to describe the inhibitions with proprietary software!

One of the things I notice is that even though many people understand and prefer copyleft, they often encourage a lax license because they assume that’s what everyone else wants. The result is a lot of people pushing for lax licenses who actually prefer copyleft.

People inside Facebook and Google know the pressure to write proprietary code better than those outside. They should be pushing for copyleft the most! On Reddit, someone suggested the MPL license. It does seem another reasonable compromise similar to LGPL.

In 2017, synthetic biology companies raised a record amount of funding – just over $1.8 billion for fifty-two companies – driven mostly by several multi-hundred-million-dollar deals. This was a 50% increase over the previous year, a pace of growth that indicated an intense interest in the field from outside investors. It seems that this interest has only intensified since then, as 27 companies raised $650 million in funding during the first quarter of 2018, double the activity of the first quarter of 2017. At this rate, the field is on track to raise over $2.4 billion with over 100 companies being funded, which would be a record for both statistics.
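The run-rate projection behind those figures is straightforward to check; a quick sketch using only the numbers quoted above:

```python
# Annualize the Q1 2018 synthetic-biology funding figures quoted above.
q1_funding = 650e6        # $650M raised in Q1 2018
q1_companies = 27         # companies funded in Q1 2018

annual_funding = q1_funding * 4       # simple 4x run-rate
annual_companies = q1_companies * 4

print(annual_funding / 1e9)   # 2.6 -- consistent with "over $2.4 billion"
print(annual_companies)       # 108 -- consistent with "over 100 companies"
```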

Synthetic Biology Companies Funding

The companies raising money in 2018 are pursuing a broadly diverse set of applications from all sections of the synthetic biology technology stack. Many companies are developing products that will eventually end up in the hands (or bodies) of everyday consumers, but others are making the tools and reagents that will empower the whole field to become more productive. It is important that all of these types of companies exist in order to build a healthy industry ecosystem.

Read more

Mitsubishi Hitachi Power Systems (MHPS) and Carnegie Mellon University (CMU) today announced the release of the 2018 Carnegie Mellon Power Sector Carbon Index, at CMU Energy Week, hosted by the Wilton E. Scott Institute for Energy Innovation. The Index tracks the environmental performance of U.S. power producers and compares current emissions to more than two decades of historical data collected nationwide. This release marks the one-year anniversary of the Index, developed as a new metric to track power sector carbon emissions performance trends.

“The Carnegie Mellon Power Sector Carbon Index provides a snapshot of critical data regarding energy production and environmental performance,” said Costa Samaras, Assistant Professor of Civil and Environmental Engineering. “We’ve found this index to provide significant insight into trends in generation and emissions. In particular, the data have shown that emissions intensity has fallen to the lowest level on record, as a combination of natural gas and renewable power have displaced more intensive coal-fired power generation.”

The latest data revealed the following findings: U.S. power plant emissions averaged 967 lb CO2 per megawatt-hour (MWh) in 2017, down 3.1 percent from the prior year and down 26.8 percent from the annual value of 1,321 lb CO2 per MWh in 2005. The 2016 result was initially reported as 1,001 lb/MWh, but was later revised downward to 998 lb/MWh.
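The percentage changes quoted above are easy to verify against the revised 2016 figure:

```python
# Emissions intensity figures from the Carbon Index release (lb CO2/MWh).
intensity = {2005: 1321, 2016: 998, 2017: 967}

def pct_drop(old, new):
    """Percent decrease from old to new."""
    return (old - new) / old * 100

print(round(pct_drop(intensity[2016], intensity[2017]), 1))  # 3.1
print(round(pct_drop(intensity[2005], intensity[2017]), 1))  # 26.8
```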

Read more

Postponing Day Zero in Cape Town for 2018 comes as no surprise. There was no sense to it once the day had been pushed into the winter rainfall period. It also didn’t make sense for the Western Cape and Cape Town governments to continue drafting detailed logistical plans for points of water distribution in the event that taps were turned off across the city.

But Cape Town’s water supplies remain at high risk because the long-term predictions for rainfall in the south-western Cape remain uncertain. Dam levels continue to fall while people are struggling to achieve the city’s target of 450 million litres per day. And yields from new water schemes will only be known in the coming months and next year.

The general perception is that the onset of climate change will be slow and measured, affording authorities the time to intervene with considered plans. But climate change is a disrupter and takes no prisoners. Over the past three years, Cape Town and the surrounding regions have experienced successive years of well-below-average rainfall. The experience is changing the way people think about water and how it is managed.

Read more

YES!!!


Scientists at the Department of Energy’s National Renewable Energy Laboratory (NREL) have discovered a new approach for developing a rechargeable non-aqueous magnesium-metal battery.

A proof-of-concept paper published in Nature Chemistry detailed how the scientists pioneered a method to enable reversible magnesium chemistry in noncorrosive carbonate-based electrolytes and tested the concept in a prototype cell. The technology possesses potential advantages over lithium-ion batteries: notably, higher energy density, greater stability, and lower cost.

NREL researchers (from left) Seoung-Bum Son, Steve Harvey, Andrew Norman and Chunmei Ban are co-authors of the Nature Chemistry paper, “An Artificial Interphase Enables Reversible Magnesium Chemistry in Carbonate Electrolytes.” They are shown working with a time-of-flight secondary ion mass spectrometry instrument, which allows them to investigate material degradation and failure mechanisms at the micro- to nano-scale. (Photo by Dennis Schroeder / NREL)

Australia’s major cities are growing more rapidly than ever before, gaining three million residents in a decade. Concerns about the risks to their long-term liveability and health are growing too. Is the consistent placing of Australian cities at the top of most liveable city rankings a reason for complacency?

The fastest-growing city, Melbourne, is experiencing unprecedented growth and yet has topped The Economist Intelligence Unit global liveability ranking for seven years running. However, much like Australia’s remarkable record of 26 years of continuous economic growth, many of the policy and institutional reforms that delivered this liveability legacy occurred decades ago.

Australia is now undergoing its third great wave of population growth, putting pressure on infrastructure, services and the environment. During the past two waves of growth, in the late-19th and mid-20th centuries, cities implemented visionary responses. It’s largely because of these past phases of planning and investment that our cities have until now been able to sustain their liveability and a reasonably healthy natural environment.

Read more