Speaking of vaccines
Speaking of vaccines, some Spanish scientists are on to one that has proven 90% effective against HIV in phase 1 trials.
Could this be the year we start to wipe out both AIDS and Malaria? That's pretty fantastic.
By the way, this is a perfect example of why it pisses me off when smarmy Silicon Valley slimeballs refer to their industry as "the tech industry", as if computers were the only kind of technology. Biologists (to name just one field) are doing stuff right now that's just as cutting edge and is about a million times more important in terms of its effect on the quality of human life.
Crowdfunding vs. DRM as the future of publishing
None of the songs I bought from the iTunes store will play any more because Apple thinks I've authorized them on too many computers; and I can't remember my Battle.net password so as far as Blizzard's concerned I no longer own that copy of Starcraft 2 I paid $60 for.
It used to be I only had to worry about losing my digital "possessions" when a magnet got near my disk drive or when an OS upgrade made my old data formats obsolete, but now... well, let's say I'm very reluctant to pay real money for an intangible electronic "product" when it can be taken away from me any time at the whim of an overzealous and glitchy DRM scheme.
This is why I'm not real keen on the idea of e-books; I like books that I can trust to stay on my shelf and continue existing even if the publisher changes their mind. Sushu's got several Kindles and was telling me about how you can now "loan" e-books to other people - the book is gone from your own Kindle for two weeks, then it comes back. (She likes this because books that she loans out the old-fashioned way pretty much never come back to her, she says.)
It's weird to think that some programmer had to write code whose sole function is to take a file that's still there on your Kindle and lock you out of it for two weeks. I imagine him at a Starbucks, swapping tips with the programmer from Blizzard who prevents users from playing Diablo 3 single-player without a connection to Blizzard's servers.
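It's easy to imagine what that lockout code might look like (a hypothetical sketch with made-up dates, not Amazon's actual implementation):

```python
import datetime

# Hypothetical sketch of a Kindle-style "loan lock". The file never leaves
# the device; the reader is simply locked out until the loan period expires.
LOAN_DAYS = 14

def can_open(loaned_at, now):
    """The book is unreadable for LOAN_DAYS after it is 'loaned out'."""
    return now >= loaned_at + datetime.timedelta(days=LOAN_DAYS)

loaned = datetime.datetime(2012, 6, 1)
print(can_open(loaned, datetime.datetime(2012, 6, 8)))    # False: locked out
print(can_open(loaned, datetime.datetime(2012, 6, 16)))   # True: loan expired
```

The entire "feature" is one comparison whose only job is to say no.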
On a computer, every "move file" operation is actually a "copy file" followed by a "delete original". The "delete original" step is optional. The default state is for everybody to have as many copies of a file as they want; reproducing the scarcity of the physical world takes work. Companies are paying workers to make there be less of their products.
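You can see this in a few lines (a toy sketch with a throwaway temp file; strictly speaking, a move within one filesystem is a single rename, but a move across devices really is copy-then-delete):

```python
import os
import shutil
import tempfile

# A "move" is a copy followed by an optional delete of the original.
# Skip the delete step and you simply have two copies.
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "book.txt")
with open(original, "w") as f:
    f.write("the full text of a novel")

copy = os.path.join(workdir, "book_copy.txt")
shutil.copy2(original, copy)          # step 1: copy
# os.remove(original)                 # step 2: delete -- entirely optional

# Both files now exist; nothing forced us to destroy the original.
print(os.path.exists(original) and os.path.exists(copy))  # True
```

Scarcity is the part you have to write extra code for.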
Yeah, yeah, yeah, I know. If we let everybody have copies of all the books they wanted for free, then writers couldn't get paid, and we wouldn't have any new books at all. I get that. It's just that, as people have been saying since at least the 90s, the publishing industry should really be coming up with new business models instead of trying to fight technological progress.
For a while we thought that new business model would be advertising. But web advertising has mutated into a creepy track-you-everywhere commercial panopticon, even as advertising fails to sustain print media. The value of web advertising is dropping as well: advertisers can now see exactly how few people are clicking on their ads, and they adjust their prices accordingly. Besides, I think relying too much on advertising puts creators into an unhealthy relationship with their readers: if the advertiser, rather than the reader, is the one paying your rent, then you have an incentive to do what the advertiser wants, even when the reader doesn't want it.
Lots of creative people on the web have moved to a merchandise-supported model. That's great if it works for them, but many types of work (say, non-fiction books) don't lend themselves to merchandise at all. And besides, there's only so many T-shirts the average comic-reading nerd can fit in their closet. Merchandise seems very limiting.
I donated to my first project thinking "huh, one of those ransom model things? OK, well, they won't take my money unless funding succeeds, so there's not much to lose; let's try it". I didn't think much more about it at the time. But as I've watched Kickstarters get more and more attention over the past few months I'm starting to think Kickstarter, or something like it, might be the answer.
(Obviously Kickstarter did not invent the ransom model of publishing; I know Stephen King did a book that way over ten years ago.)
But here's the thing: Kickstarter-style crowd-funding is one of the very few models in which the creator actually gets paid for doing the work of creation. With advertising you get paid for delivering customer eyeballs to advertisers, and that indirectly funds the creation of the work. Even with traditional publishing, the money comes from rectangular masses of dead tree pulp being shipped around to stores, and the sales of those objects reimburse the publisher for the advance they gave the author for work already completed.
The work of a creator is to make a thing exist which never existed before. Kickstarter relates this to money in a very direct way: if enough fans say "Yes, I am a potential audience member, and it's worth $X to me for this thing to exist", then they pool their money and the creator gets it. And the successful Kickstarters generally seem to be the ones where the creator explains why they need that amount of money, and what exactly it will be put towards -- the ones where the costs are transparent and justifiable, in other words.
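The mechanism itself is simple enough to sketch (a toy model with made-up numbers, not Kickstarter's actual system):

```python
# Toy all-or-nothing crowdfunding: pledges are only collected if the
# total meets the creator's stated target; otherwise nobody is charged.
def fund_project(target, pledges):
    total = sum(pledges)
    if total >= target:
        return ("funded", total)   # creator is paid for making the thing
    return ("not funded", 0)       # no money changes hands; interest gauged

# Enough fans say "it's worth $X to me for this thing to exist":
print(fund_project(5000, [25, 100, 250, 5000]))   # ('funded', 5375)
print(fund_project(5000, [25, 100, 250]))         # ('not funded', 0)
```

The threshold is what makes it work: below it, backers risk nothing, and the creator learns the interest isn't there before sinking time into the project.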
I could even see somebody in the future making a living off of one crowd-funded project after another, setting the funding targets of the projects to cover all their living costs, and not even having to care about piracy or DRM or artificial scarcity. Who cares if some people get a pirate copy, if you've already been paid the value of your time and labor for making the thing exist?
Maybe the bigger risk is that a "creator" will take everybody's money and then never deliver the work. There has been at least one high-profile attempted Kickstarter scam already, but people got wise to it before it was funded and it got taken down. Sooner or later somebody will do a scam competent enough to succeed. It will be interesting to see what happens to Kickstarter then.
I read an interesting article today about how the Kickstarter website doesn't show you the 56% of projects that fail to meet their funding target. The author says "56%" like it's a bad thing. A 44% success rate is amazing, far higher than I imagined. And it's good that some projects don't get funded. The funding process is a way of gauging interest. If the interest isn't there, won't you be glad to find that out up front? You don't waste time making the thing, and you don't go into debt financing it.
So yeah, projects fail. There are still no guarantees of success. Getting publicity for your kickstarter is still hard. There is only a finite amount of donor money out there, and a finite amount of donor attention. (Attention may be even scarcer than money). People who are already famous from other projects have a huge advantage getting attention for their Kickstarter campaign.
But none of those problems are new. It's always been hard for first-time creators to get attention for their work. There's always been competition for a limited number of audience dollars. That's part of the service that publishers provide - they know how to generate publicity. In fact, generating publicity may soon be the only function of publishers that technology does not render obsolete. (Well, that and editing. Editing is a valuable service and most stuff published on the internet would be a lot better if it had some!)
Maybe in the future, a "publisher" will be somebody you hire to manage your crowd-funding campaign for you? And the trustworthiness of the publisher's brand will be part of what convinces potential donors that you're not a scam -- that they can trust you to actually finish making the thing. It's also a reassurance that you meet somebody's standard of quality.
After all, there may be no limits on file duplication, but there are still limits on audience attention span, so that's the resource we need to pay attention to. The future will be interesting!
Science/Technology link roundup, June 2012
China's first manned space docking mission was a success. A "Long March" rocket carrying three astronauts rendezvoused with the Tiangong 1 space-lab module, which was put into orbit last year.
Japanese biologists grow a human eye precursor from stem cells. Not yet a functioning eyeball, but an embryonic proto-eyeball structure called an "optic cup". We've long known that in principle the structure of all the body's systems and organs is encoded in the DNA of a single cell, but this is the first time that a complex three-dimensional structure has been grown on its own from human cells. A future where we can grow replacement organs from our own DNA is going to be pretty cool.
Scientists in Long Island have mapped out the "wiring" of the mouse brain. This is not the same as knowing what every neuron in the mouse brain does. It's the equivalent of sequencing the genome, which doesn't tell us what every gene does but does provide the vital high-level framework and context for future exploration. This is the first vertebrate brain to be mapped at this level of detail and is considered a first step towards mapping the human brain. The team has made lots of hi-res images publicly available.
Google Research has trained a 9-layer deep neural network to recognize faces based on an unlabeled data set. The cool thing here is that the training data is unlabeled. Usually when you train a classifier to recognize faces, you have to give it a set of pictures with faces and a set of pictures without faces, so it can learn what to look for. In this research, Google just fed the neural network thousands of pictures without any labels and it learned to tell features apart, without any knowledge of what it was supposed to be looking for. It didn't just learn to recognize faces; it also learned to recognize several other common picture elements, such as the presence of cats.
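For a sense of what "learning without labels" means, here's a toy unsupervised-learning sketch -- a tiny one-dimensional k-means, nothing like Google's actual nine-layer network -- that discovers two groups in data it was never told anything about:

```python
# Toy unsupervised learning: 1-D k-means with k=2. The algorithm is never
# told what the two groups "mean"; it discovers them from the data alone.
def kmeans_1d(points, iterations=10):
    centers = [min(points), max(points)]   # crude initialization
    for _ in range(iterations):
        groups = ([], [])
        for p in points:
            nearest = 0 if abs(p - centers[0]) <= abs(p - centers[1]) else 1
            groups[nearest].append(p)
        # Move each center to the mean of its group (keep it if group empty)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers

# Unlabeled values drawn from two clumps; no labels anywhere:
data = [0.1, 0.2, 0.15, 0.9, 0.95, 1.0]
print(kmeans_1d(data))
```

Deep networks learn vastly richer structure than two cluster centers, but the principle is the same: the regularities come out of the data, not out of human-supplied labels.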
Meanwhile, Microsoft has been using the same sort of "deep neural network" method to train a system for better speech recognition; they're using it to make audio files searchable, which is pretty cool.
I seem to recall my machine learning course at U of C teaching us that a neural network with lots of hidden layers doesn't perform any better than a neural network with just an input and output layer. Apparently the "deep training" Google and Microsoft are doing is based on a breakthrough that was made in 2006 -- the year after I graduated. Ha!
Google is working on augmented-reality glasses, too. That might be neat!
They are also releasing some sort of mystical black orb of doom that costs $300. But what does it do? Something music-related? I'll let the designer explain:
“The sphere is a zero primitive form,” Jones says. “It transcends into this third wave of electronics where the interface, Android, is on another device. So now the actual object doesn’t have the burden of direct manipulation. It can have any presence and gesture within the room, and this encourages you to interact with it.”
Thanks, that sure clears things up. I'm sure everybody will want one.
I had been vaguely aware that there existed proprietary pre-internet networks in Europe, but I had never heard about "Minitel" until I read this article about the rise and fall of France's own government-sponsored proto-internet service, which is finally going offline this year.
Do-it-yourself surveillance drones, because why should the military have all the fun?
Germany is trying to switch completely to green energy, shutting down its nuclear plants while aiming to cut greenhouse gases 40% by 2020. They describe this extremely ambitious plan as the "Energiewende" -- literally, the energy transition.
A new paper by a group of 22 climate scientists and ecologists summarizes what we know on tipping points in ecosystems. You may have seen this research reported on various news sites with sensationalistic and attention-grabbing headlines like "WE'RE ABOUT TO PUSH THE EARTH OVER THE BRINK".
The actual paper, as far as I can tell, does not appear to make this claim; from what I've read it seems to be more like "our ecosystem models are prone to rapid phase transitions when they cross a tipping point; the earth might be too, so watch out". They do point out that human activity uses 43% of the earth's land surface and speculate that maybe something scary will happen when we pass 50%? They don't claim to know the answer. We might be approaching a tipping point, but we don't know when, or what it will be, or what the ecosystem will look like afterwards. It's all about uncertainty and the need to know more about how the global ecosystem works.
Unfortunately, no matter how nuanced scientists try to be in their actual statements, the media always turns it into "WE'RE ALL GONNA DIE". This is a good reason to be skeptical of science reporting in mainstream media and to try to get as close as you can to the original research (unfortunately, the actual article in question is behind a paywall.)
Coding for SPACE
Software didn't dump me, I dumped software.
The love was gone; it was time to end it. Still, the relationship lasted over ten years, and the breakup has been hard. Some days I'm angry, some days I'm wistful, some days I just don't care.
I'm in the process of unwinding myself from my "computer guy" identity. It's hard. At the moment I don't know who I am. The only thing I'm sure of is that making software isn't what I want to do with the rest of my life.
Programming computers will probably be part of what I do. I have a lot of skill points invested in it, after all, and besides, everything's got a computer in it these days. But programming is just a means. The goal has to be something else, not just computers for the sake of computers.
This "Advice from an old programmer" (from "Learn Python the Hard Way") spoke to me:
Programming as a profession is only moderately interesting. It can be a good job, but you could make about the same money and be happier running a fast food joint. You're much better off using code as your secret weapon in another profession.
People who can code in the world of technology companies are a dime a dozen and get no respect. People who can code in biology, medicine, government, sociology, physics, history, and mathematics are respected and can do amazing things to advance those disciplines.
That sounds good!
Mozilla was a local maximum: better than any similar job, with every adjacent direction leading down. The only way to get a better job than mine at Mozilla is to do something completely different.
The most excited and optimistic I've felt about technology in the last several years was when I stayed up until the wee hours of the morning watching the livestream from NASA's site of the SpaceX Dragon capsule docking with the International Space Station.
The SpaceX CEO, Elon Musk, seems like a really cool guy. He made a fortune at PayPal and instead of retiring young, he put his money into space exploration and electric cars (he's also CEO of Tesla Motors). Lots of respect for that dude.
And SpaceX is hiring computer programmers to do physics simulations. I bet I could do that.
Only problem is that they're in LA, and it would be hard on Sushu for me to ask her to leave her school.
Solar energy link roundup
I've seen a lot of news lately about technological advancements that could make solar cells a lot cheaper - either by decreasing the cost of materials, the cost of production, or the cost of installation. I'm hoping to see the price of electricity from solar become truly competitive (i.e. without government subsidies) against electricity from fossil fuels in the near future. There are inspiring signs of progress!
UC Berkeley scientists figured out a way to make solar panels out of any semiconductor, allowing them to potentially be made out of cheaper materials than the high-quality silicon currently required. For example: Copper oxide.
Some German companies are developing robots that can install solar panels, reducing the labor costs of installation.
A startup company called Twin Creeks, in San Jose, has invented a way to create solar cells one tenth the thickness of current cells by bombarding them with a hydrogen ion cannon (!). At scale, this could lead to enormous savings in raw materials.
Even more science-fictiony than the ion cannon: Vanderbilt University scientists have made a bio-hybrid solar cell by combining silicon with a photosynthetic protein. The protein, called PS1, is extracted from spinach leaves and continues to function outside of the plant, converting sunlight into electrical energy with nearly 100% efficiency (compared to ~40% for manmade devices). The bio-hybrid solar cell is still a long way from practical mass production, though.
It takes energy to make solar panels, and that energy has to come from somewhere, so this has to be factored into their lifetime cost (and environmental impact). The National Renewable Energy Laboratory has found a way to potentially cut the energy used to produce panels in half using an optical furnace to heat up the silicon substrate.
Finally, some innovations are low-tech: standardized mounting brackets that don't require specialized tools can make panel installation faster and easier, reducing labor costs. A Chinese company called Trina Solar and a German company called Solon Energy have both invented designs to do just that.
Robots are gonna take all our jobs
Even if we survive the displacement from rising sea levels, the food shortages from climate-change-induced droughts, and the bee die-off, we can look forward to a future where robots have made us all obsolete:
The robot threat: In the long run, we are telepathic androids | The Economist
Assuming Moore's Law keeps churning away at its normal exponential pace, Mr Drum figures that will happen somewhere around 2040, and it will gradually make our current economic assumptions untenable: most humans will become permanently unemployable since there will be nothing they can do that a robot can't do better and cheaper, which means there will be too few consumers to create demand for the products the robots can create.
Welcome, Robot Overlords. Please Don't Fire Us? | Mother Jones
Increasingly, then, robots will take over more and more jobs. And guess who will own all these robots? People with money, of course. As this happens, capital will become ever more powerful and labor will become ever more worthless. Those without money -- most of us -- will live on whatever crumbs the owners of capital allow us.
Of course, this disruption is already happening. People are already losing their jobs to "robots", even though they don't look much like science-fiction robots -- they're mostly internet-connected algorithms.
There used to be a job called "video rental store clerk", for example (I used to be one) but Netflix has rendered that job obsolete. There used to be a job called travel agent, but Expedia and other airline-search websites eliminated that. And of course Google is putting a lot of research into taking away the jobs of taxi drivers and truck drivers with their driverless cars.
Jaron Lanier (author of "You Are Not A Gadget", which I highly recommend) says in an interview with Slate that The Internet Destroyed The Middle Class:
At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When Instagram was sold to Facebook for a billion dollars in 2012, it employed only 13 people. Where did all those jobs disappear? And what happened to the wealth that all those middle-class jobs created?
If you think about it, software is only profitable if you can sell it to an organization. An organization is only going to buy the software if it saves them money. And how does it save them money? By letting them fire workers.
Whenever startup guys talk about "disrupting" an industry, what they mean is "we're going to fire all your workers and replace them with software, so that we -- the controllers of the software -- can be the new middlemen".
There are currently a lot of startup guys talking about "disrupting" education. Which means that teachers should be very, very afraid.
I can imagine a world where robots do all the work. In that world, capitalism and the current social contract of labor-for-wages are simply untenable. They'd have to have some other economic system for distributing the goods and services produced by all their robots. But how do we get there from here? In the short term, capitalism isn't going anywhere. And capitalism is going to ensure that technological advances continue to displace workers, while all of the productivity gains from the new technology are captured by the owners of industry.
It's a lot like what happened during the industrial revolution. If you take the very long view, you could say that the industrial revolution ended up making the economy better for everyone -- worldwide living standards and education levels and so on are higher now, and we have new jobs that are better than the old crappy jobs that were eliminated. But the long term benefit was small comfort to the people who lived through the industrial revolution and saw their jobs replaced by machines.
I'm not saying we should stop technological progress, even if we could. Instead, I think that the ongoing destruction of jobs by technological progress should be an argument for re-examining our economic system and our social contract, to try to come up with a system where the benefits of technological efficiency gains can be shared across society instead of accruing only to the top.