Thursday, July 16, 2015

Friday Thinking 17 July 2015

Hello – Friday Thinking is curated on the basis of my own curiosity and offered in the spirit of sharing. Many thanks to those who enjoy this. 
In the 21st Century curiosity will SKILL the cat.

The Knotty Objects
“Knotty Objects” are objects for which conception, design, manufacturing, use and misuse are non-linear, non-discrete. They entangle practices, processes, and policies. When successful, they transform material practice, manufacturing culture, and social constructs.

We believe that the collaboration between design and science–a reality in centuries past that was temporarily lost in the course of the 20th century, when the world thought it could understand what design was–will become an important staple of education and practice in the future. And we hope that the discussions from Knotty Objects will remain in your mind and will spark new questions and new ideas.
Knotty Objects celebrating the chimeric nature of design

WTF?! In San Francisco, Uber has 3x the revenue of the entire prior taxi and limousine industry.

WTF?! Without owning a single room, Airbnb has more rooms on offer than some of the largest hotel groups in the world. Airbnb has 800 employees, while Hilton has 152,000.

WTF?! Top Kickstarters raise tens of millions of dollars from tens of thousands of individual backers, amounts of capital that once required top-tier investment firms.

WTF?! What happens to all those Uber drivers when the cars start driving themselves? AIs are flying planes, driving cars, advising doctors on the best treatments, writing sports and financial news, and telling us all, in real time, the fastest way to get to work. They are also telling human workers when to show up and when to go home, based on real-time measurement of demand. The algorithm is the new shift boss.

WTF?! A fabled union organizer gives up on collective bargaining and instead teams up with a successful high tech entrepreneur and investor to go straight to the people with a local $15 minimum wage initiative that is soon copied around the country, outflanking a gridlocked political establishment in Washington.

What do on-demand services, AI, and the $15 minimum wage movement have in common? They are telling us, loud and clear, that we’re in for massive changes in work, business, and the economy.

What is the future when more and more work can be done by intelligent machines instead of people, or only done by people in partnership with those machines? What happens to workers, and what happens to the companies that depend on their purchasing power? What’s the future of business when technology-enabled networks and marketplaces are better at deploying talent than traditional companies? What’s the future of education when on-demand learning outperforms traditional universities in keeping skills up to date?

Over the past few decades, the digital revolution has transformed the world of media, upending centuries-old companies and business models. Now, it is restructuring every business, every job, and every sector of society. No company, no job is immune to disruption.
Tim O’Reilly The WTF Economy

...the more one learns about robots and algorithms, the less akin to us they appear. They’re not our rivals or our mind-children. They’re not our “successors” any more than smokestacks or mathematics textbooks are our successors. Militarized robots might wipe us out, but we could be wiped out by our coal mines, or our hydrogen bombs, or even by some minor virus, or an asteroid. That doesn’t make robot drones, fossil fuels, bombs, illness, or cosmic accidents our successors. They’re of a different category than us. We modern homo sapiens will most likely be done in by some of our own close hominid relatives if we play true to type over our last two million years.

Robots just don’t want to live. They’re inventions, not creatures; they don’t have any appetites or enthusiasms. I don’t think they’d maintain themselves very long without our relentlessly pushing them uphill against their own lifeless entropy. They’re just not entities in the same sense that we are entities; they don’t have much skin in our game. They don’t care and they can’t be bothered. We don’t yet understand how and why we ourselves care and bother, so we’d be hard put to install that capacity inside our robot vacuum cleaners.
Interview with Sci-Fi Author Bruce Sterling: Alien-Computer Interfaces

Here’s an interesting report on the present. Perhaps not a precise prediction of the future - but certainly of the next few years’ challenges.
The Outlook on the Global Agenda 2015
An analysis of the Top 10 trends which will preoccupy our experts for the next 12-18 months as well as the key challenges facing the world’s regions, an overview of global leadership and governance, and the emerging issues that will define our future.

One more experiment - with what may be an inevitable trajectory in the digital environment where everything that can be automated will be.
Why is this Dutch city giving residents free money?
Starting this fall, the Dutch city of Utrecht will begin an ambitious yearlong experiment: giving monthly checks to numerous people already on welfare, no strings attached.
The concept is known as a basic income.

It affords citizens a standard amount of money to cover expenses, ranging from major health costs to quick trips to the grocery store, on top of their other sources of revenue.

Richard Nixon gave a similar idea a try in the 1960s. A decade later, Canada conducted its own experiment. And in 2016, Switzerland is slated to hold a referendum on implementing basic income.

Basic income has yet to emerge in full force partly because of logistics and partly because of fear of abuse. But that fear may be misguided, at least in the Netherlands, according to those conducting the experiment.

MIT has developed a way to enable Big Data analysis with privacy, using the blockchain technology behind Bitcoin.
MIT’s Enigma: Decentralized Cloud Platform with Guaranteed Privacy
A pair of Bitcoin entrepreneurs and the MIT Media Lab have revealed a prototype for a system called Enigma, a decentralized cloud platform with guaranteed privacy. Enigma allows users to store, share, and analyze personal data without it being fully revealed to any party. Powered by the blockchain, Enigma aims to be a secure multi-party computation platform.
The team at MIT has already developed a prototype of Enigma, based on a highly optimized version of secure multi-party computation, guaranteed by a verifiable secret-sharing scheme. The MIT Media Lab’s whitepaper says:

“For storage, we use a modified distributed hashtable for holding secret-shared data. An external blockchain is utilized as the controller of the network, manages access control, identities and serves as a tamper-proof log of events. Security deposits and fees incentivize operation, correctness and fairness of the system. Similar to Bitcoin, Enigma removes the need for a trusted third party, enabling autonomous control of personal data. For the first time, users can share their data with cryptographic guarantees regarding their privacy.”

The main principle behind Enigma is a peer-to-peer network that enables different parties to jointly store and run computations on data without compromising privacy. Encryption has been used successfully for privacy; however, the problem with encrypted data is that sooner or later users have to decrypt it. Users can keep their cloud files cryptographically scrambled using a secret key that only they possess, to rule out any hacking. However, as soon as they want to actually do something with those files - anything from editing a word document to querying a database of financial data - they have to unlock the data and leave it vulnerable.

Thus, with ordinary data encryption, there is still a risk from hackers. However, the MIT Media Lab believes it has solved this problem with “homomorphic encryption,” a still-mostly-theoretical advance in the science of keeping secrets. The MIT team plans to use this new form of encryption in Enigma.
Here’s the link to the Enigma White Paper:
Enigma: Decentralized Computation Platform with Guaranteed Privacy
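As a toy illustration of the secret-sharing idea behind Enigma, here is a minimal sketch of Shamir’s scheme in Python. This is not Enigma’s actual protocol - the prime, share counts, and function names are invented for the example - but it shows the key property the whitepaper relies on: nodes holding shares can compute on data (here, add two secrets) without ever seeing the inputs.

```python
import random

PRIME = 2_147_483_647  # a Mersenne prime; all arithmetic is mod PRIME

def make_shares(secret, n_shares, threshold):
    """Split `secret` into n_shares points on a random polynomial of
    degree threshold-1 whose constant term is the secret (Shamir's scheme)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, n_shares + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Recover the constant term by Lagrange interpolation at x = 0."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# Two parties secret-share their private values ...
a_shares = make_shares(120, n_shares=5, threshold=3)
b_shares = make_shares(305, n_shares=5, threshold=3)

# ... each node adds its shares locally, never seeing the inputs ...
sum_shares = [(x, (ya + yb) % PRIME) for (x, ya), (_, yb) in zip(a_shares, b_shares)]

# ... and any 3 nodes together can reconstruct only the sum.
print(reconstruct(sum_shares[:3]))  # 425
```

Because the sharing is linear, addition works share-by-share; fewer than three shares reveal nothing about either input. (Enigma’s whitepaper layers a verifiable variant of this, plus the blockchain as controller, on top of the same idea.)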

And here’s something hinting at new forms of democratic decision-making. A two-part video presentation - Part 1 is 33 min and Part 2 is 48 min.
Voting Methods with Google Votes
Learn the ins and outs of common voting methods and algorithms for group decision-making with illustrated examples in Google Votes. Google Votes is a Google-internal voting platform used for decisions like food selections, cafe names, t-shirt designs, and Halloween contests. This talk covers Approval, Ranked, Range/Score, and Plurality voting along with the Borda and Schulze ranking algorithms and their relation to Condorcet's Paradox.
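As a quick illustration of one of the methods the talk covers, here is a minimal Borda count in Python (the ballots and candidate names are invented for the example, not taken from Google Votes):

```python
def borda(ballots):
    """Borda count: on a ballot ranking k candidates, first place earns
    k-1 points, second k-2, ..., last 0; the highest total wins."""
    scores = {}
    for ranking in ballots:
        k = len(ranking)
        for place, candidate in enumerate(ranking):
            scores[candidate] = scores.get(candidate, 0) + (k - 1 - place)
    return scores

# Three voters rank three cafe-name candidates.
ballots = [
    ["Grace", "Ada", "Alan"],
    ["Ada", "Alan", "Grace"],
    ["Ada", "Grace", "Alan"],
]
print(borda(ballots))  # {'Grace': 3, 'Ada': 5, 'Alan': 1}
```

Ada wins here even though Grace and Ada each have first-place votes - Borda rewards broad support across rankings, which is exactly the kind of behavior (and pathology, via Condorcet’s Paradox) the talk compares across methods.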
Part 2
Liquid Democracy with Google Votes
Google Votes is an experiment in liquid democracy built on Google's internal corporate Google+ social network. A Liquid Democracy system gives all the control of Direct Democracy with the scalability of Representative Democracy. Users can vote directly or delegate power through their social networks. This talk covers user experience aspects of delegated voting and three graph algorithms for flowing votes through a social graph called Tally, Coverage, and Power.
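Google’s Tally, Coverage, and Power algorithms aren’t spelled out in the summary, but the core delegation idea can be sketched in a few lines of Python. This is an illustrative toy, not Google Votes’ implementation - the voter names and the rule that cyclic delegations cast no vote are assumptions:

```python
def tally(direct_votes, delegations):
    """Resolve each voter's effective vote by following delegation
    chains; voters caught in a cycle or dead end cast no vote."""
    counts = {}
    for voter in set(direct_votes) | set(delegations):
        seen = set()
        current = voter
        # Follow delegations until we reach someone who voted directly.
        while current not in direct_votes:
            if current in seen or current not in delegations:
                current = None  # cycle or dead end: vote is lost
                break
            seen.add(current)
            current = delegations[current]
        if current is not None:
            choice = direct_votes[current]
            counts[choice] = counts.get(choice, 0) + 1
    return counts

direct_votes = {"amy": "tacos", "bob": "sushi"}
delegations = {"carl": "amy", "dana": "carl", "eve": "dana"}  # chain -> amy
result = tally(direct_votes, delegations)
print(result["tacos"], result["sushi"])  # 4 1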

This is an interesting article for anyone interested in the near future of how emerging technology will disrupt incumbent financial services and banking.
FinTech’s Secret Weapon Against Central Banks
“The Future of Financial Services,” a 178-page report released on June 30 by the World Economic Forum, explains one of the greatest advantages of FinTech firms. They have the ability to reduce friction by using the most innovative software and technology. Jesse McWaters, project manager of the Center for Global Industries, World Economic Forum, said:

“Today’s innovators are aggressively targeting the intersection between areas of high frustration for customers and high profitability for incumbents, allowing them to ‘skim the cream’ by chipping away at incumbents’ most valuable products. It is hard to think of a better example of this than remittance — banks have traditionally charged very high fees for cross-border money transfers and offered a poor customer experience, with transfers often taking up to three days to arrive at their destination.”

To illustrate how fast global FinTech is growing: total financing in 2008 was less than US$930 million. By October 2014, it had grown to over US$1.04 billion in that month alone. Some raise the question of whether this growing ecosystem should form partnerships within the existing banking system. The real question, however, is whether the traditional financial sector can even adapt. Andreas Antonopoulos said in March at the MIT Bitcoin Expo 2015:

"Where do we go when the era of Central Banking dies? Because it's about to die — by decentralized organisms that are dynamically scalable and software that can be modified. I am very confident that software systems can and do adapt dynamically."

On the topic of banks and finance here is a different model with basic principles that have stood the test of time - and could perhaps be incorporated in the re-imagining of finance.
Working With Islamic Finance
Islamic finance refers to the means by which corporations in the Muslim world, including banks and other lending institutions, raise capital in accordance with Sharia, or Islamic law. It also refers to the types of investments that are permissible under this form of law. A unique form of socially responsible investment, Islam makes no division between the spiritual and the secular, hence its reach into the domain of financial matters. Because this sub-branch of finance is a burgeoning field, in this article we will offer an overview to serve as the basis of knowledge or for further study.

Central to Islamic banking and finance is an understanding of the importance of risk sharing as part of raising capital and the avoidance of riba (usury) and gharar (risk or uncertainty). (To see more on risk, read Determining Risk And The Risk Pyramid and Personalizing Risk Tolerance.)

Islamic law views lending with interest payments as a relationship that favors the lender, who charges interest at the expense of the borrower. Because Islamic law views money as a measuring tool for value and not an 'asset' in itself, it requires that one should not be able to receive income from money (for example, interest or anything that has the genus of money) alone. Deemed riba (literally an increase or growth), such practice is proscribed under Islamic law (haram, which means prohibited) as it is considered usurious and exploitative. By contrast, Islamic banking exists to further the socio-economic goals of Islam.

Islamic finance is a centuries-old practice that is gaining recognition throughout the world and whose ethical nature is even drawing the interest of non-Muslims. Given the increased wealth in Muslim nations, expect this field to undergo an even more rapid evolution as it continues to address the challenges of reconciling the disparate worlds of theology and modern portfolio theory...

The vertical farm has been around as an idea for a number of years - and now the idea is slowly being implemented.
FarmedHere: The Nation’s Largest Indoor Organic Farm Now Growing in Chicago
FarmedHere has been growing greens in the Chicago area since 2011 and has a 10,000-square-foot facility in Flanagan and a 4,000-square-foot one in Englewood, IL. Jolanta Hardej, the CEO and co-founder of the company, used to be an interior designer and a mortgage broker, but had the vision for the farm after the market collapse in 2008. The new 90,000-square-foot facility in Bedford Park, southwest of Chicago, is just a nondescript warehouse from the outside, but on the inside it’s a lean, mean growing machine.

By using vertical stacking, aquaponics and aeroponics, they’ve got 150,000 square feet of growing space — almost 3.5 acres. Aquaponics produces organic herbs like basil and other greens and supports raising tilapia, while the aeroponics produces leafy greens like arugula and watercress. There are six shelves of plants growing at one time, tended by workers using lifts. Water is also highly conserved: according to Hardej, they use only 3 percent of the water that traditional agriculture uses, and it’s all recycled.

Urban farming also has a number of other benefits, like employing local workers and reducing transportation and shipping. The company also hires local youths in an urban agricultural training program through Windy City Harvest. FarmedHere was awarded its USDA Organic Certification at the end of 2012 and the new facility was in part funded by Whole Foods, the farm’s largest customer. Besides Whole Foods, FarmedHere products are sold at Chicago-area Mariano’s locations, Green Grocer, and possibly soon at Trader Joe’s and Meijer.

The disruption to education is often an argument about financing learning - this may be more than a weak signal of an emerging choice around how we fund education.
After student exodus, 900 laid off by University of Phoenix
Over the past five years the for-profit institution has shed about half of its students, and since September the university has fired 900 employees.  

University of Phoenix’s parent company, Apollo Education Group, announced dismal quarterly earnings Monday after the markets closed. Its revenue and student enrollment both dropped about 14% from the same time a year ago.

Investors ditched class Tuesday morning, and Apollo’s stock tanked. Apollo’s stock has lost more than half its value — 61% — since the beginning of this year.

“We’re obviously going through a period of transformation where there is a higher than normal volatility within our sector and our company,” said Apollo Education Group CEO Gregory Cappelli during a conference call Monday.

Cappelli projects that by next year student enrollment will be about 150,000. It’s currently 206,000. Cappelli hopes to stabilize enrollment in the coming years. University of Phoenix has closed over 100 campuses in recent years. Meanwhile, its former competitor, Corinthian Colleges, shut down last year.

Tighter regulations on for-profits and the Obama administration’s push to make community college free top the list of headwinds. And non-profit universities have entered the online education space, where for-profit schools once held center stage.

And just wait for this generation (after the Millennials) to hit the workforce.
Televisions Are No Longer the Screen of Choice for Kids
Traditional Television Seen as Punishment; iPads Trump Dessert
A new study that examines media consumption among the footie-pajama set may provide a clue to the root cause of an ongoing ratings drought wreaking havoc on kids' cable ratings.

According to a research report from Miner & Co. Studio, televisions are no longer the screen of choice for kids who have ready access to tablets and smartphones. More than half (57%) of parents surveyed said their children now prefer to watch video on a handheld device rather than on TV.

Mobile devices are so popular with kids that nearly half of the 800 parents quizzed by Miner & Co. reported that they confiscate their kids' tablets when they act up and make them watch TV instead, thereby fostering a sort of Pavlovian response that equates TV with punishment. (That these parents simply don't restrict their kids' access to video altogether when they misbehave suggests that they're raising a generation of spoiled content junkies, but that's another story.)

Some kids are so obsessed with the small screen that they'll even forego treats for another few minutes with their portable video device. When given the choice between spending quality time with the tablet or having dessert, 41% of the parents surveyed said their kids would pick the screen over the snack.

In a video Miner cooked up to illustrate the results of the study, one child who was confronted with the tablet-or-dessert ultimatum broke it down thusly: "I know it's like, 'Whoa, why would you do the iPad!' But because it's like, one cookie? I probably can always get another cookie."

Later in the clip, we see a toddler's reaction when the tablet he'd been using is given to his sister. Let's just say the little guy is not pleased. Another kid then proffers an opinion that should chill Viacom and Disney executives to their very marrow: "Well, basically, TV sucks." Whether he was subsequently forced to watch TV as punishment for his pottymouthed musings remains lost to history.

If television is being disrupted, what is the emerging disruptor - what’s the new TV-like platform?
Facebook Is Eating the Media. Is That Such a Bad Thing?
Wednesday, May 13, 2015: That’s the day journalism sold its soul, according to some of the more hysterical corners of the media echo chamber.

What actually happened is that Facebook launched a feature called Instant Articles. The feature allows Facebook users to read full interactive stories from the New York Times, BuzzFeed, National Geographic, and others without leaving Facebook. In the past, media outlets simply posted links on Facebook that directed readers to their own websites to read the full story. Now they’re allowing Facebook to host those same stories, which gives Facebook unprecedented control over their content.

You can see the inaugural batch of Instant Articles today by logging into the Facebook mobile app on an iOS device. (An Android version is reportedly in the works.) If your friends happen to be sharing them, they’ll appear in your News Feed just like any other Facebook post would. Otherwise you can seek them out, although you’ll have to do it in the latest version of the mobile iOS app if you want to see what they look like. On your Web browser, they’ll just look like normal articles.

This is a short, easy article about the near future of VR and AR. It elaborates the difference with a concise table, which makes the issues easier to understand.
The 7 Drivers Of The $150 Billion AR/VR Industry
Augmented Reality and Virtual Reality won’t get going properly until next year, but we’re already seeing a healthy level of competition between early market players.
While internal market competition is inevitable, the fight that really matters is between AR/VR’s installed base and the huge incumbent smartphone and tablet market (PCs and consoles enable VR, rather than competing with it).

There are more than 4 billion smartphones and tablets today, and this is set to hit 6 billion by 2020. In contrast, AR and VR hope to go from a standing start in 2016 to hundreds of millions of units in the same timeframe.

So while $150 billion AR/VR revenue by 2020 is a really big number, the trillions generated by mobile put it in perspective. AR and VR companies should have nightmares about Apple, Samsung, Huawei, Lenovo and Xiaomi, not each other.

This is very interesting - a metaphor map. Metaphors aren’t decorations overlaid on language that is literal. Metaphors structure how we reason.
Metaphor map charts the images that structure our thinking
Huge project by Glasgow University researchers plots thirteen centuries of startling cognitive connections
Metaphor is not the sole preserve of Shakespearean scholarship or high literary endeavour but has governed how we think about and describe our daily lives for centuries, according to researchers at Glasgow University.

Experts have now created the world’s first online Metaphor Map, which contains more than 14,000 metaphorical connections sourced from 4m pieces of lexical data, some of which date back to 700AD.

While it is impossible to pinpoint the oldest use of metaphor in English, because some may have been adopted from earlier languages such as Germanic, the map reveals that the still popular link between sheep and timidity dates back to Old English. Likewise, we do not always recognise modern use of metaphor: for example, the word “comprehend” comes from Latin, where it meant to physically grasp an object.

The three-year-long project to map the use of metaphor across the entire history of the English language, undertaken by researchers at the School of Critical Studies, was based on data contained in the Historical Thesaurus of English, which spans 13 centuries.
Dr Wendy Anderson, the project’s principal investigator, said that the findings supported the view that metaphor is pervasive in language and is also a major mechanism of meaning-change.

“This helps us to see how our language shapes our understanding – the connections we make between different areas of meaning in English show, to some extent, how we mentally structure our world”, she said.
Here’s the link to the Metaphor Map:

In the territory of maps - this is a wonderful interactive data visualization - worth the view.
PRB’s Digital Visualization highlights key global demographic trends. Explore current and projected population by region and country. And look at changes in total fertility, infant mortality, and life expectancy since 1970. A U.S. “What-If” scenario focuses on the effects of race and ethnicity on child poverty, child obesity, and college degrees.

Also check out PRB’s 2014 World Population Data Sheet, interactive map, and DataFinder.

This is an interesting possibility hinting at the looming shift in the transportation paradigm: the possibilities of autonomous self-driving cars as an enhancement of mass transit - and a shift from private ownership of cars to subscription-based access to transportation.
Have we really reached 'peak car'?
Vehicle traffic grew at a fearsome rate worldwide for decades … until 2007. Then came the perfect storm of an economic collapse, a digital revolution and major changes to urban lifestyles. But is this just a blip?
A funny thing happened on the way to Carmageddon: the predicted traffic failed to show up. As engineers continued to forecast traffic growth in line with historic averages – up, up and yet farther up, to an eventual “carpocalypse” – actual traffic not only fell short of projections, in many places it just plain fell. A growing number of researchers and commentators are now suggesting that we’ve reached “peak car”, the point at which traffic growth stops, and potentially even falls on a per capita basis.

Total vehicle miles travelled (VMT) has been outpacing population and jobs for decades, across industrialised countries.
Here’s another article on the same topic - this one, by Thomas Frey, has some different graphs.
The Coming of “Peak Car”
In what year will the number of cars in the world reach its peak and auto sales overall begin to decline?
For most, it may be surprising to realize we’re already there in the U.S. Growing data shows many wealthy economies have already hit “peak car,” a point of market saturation characterized by an unprecedented deceleration in the growth of car ownership, total miles driven, and annual sales.

Vehicle traffic grew at a staggering rate worldwide for decades. But that all changed in 2007. Some refer to it as the perfect storm with the combination of economic collapse, digital revolution, and major shifts in urban lifestyles.

Several alternative transportation startups also began in that timeframe led by the likes of Zipcar, Uber, Lyft, and SideCar. This was followed by the emergence of connected cars, growing electric vehicle markets, driverless cars, declining birthrates, and increasingly congested highways in virtually every major city in the world.

Mounting indicators are painting a clear picture of an automobile industry only a few years away from reaching the top of the bell curve in the rest of the world as well.

Maybe this will mitigate the ‘Peak Car’ projection - this is a short opinion piece that was recommended by Kevin Kelly.
The Electric Car
The electric car is going to take over the world.  Soon.  Let me explain.
75% of US consumers and over 85% of US millennials own smartphones.  Perhaps more amazing is that 1/4 of people in the world use a smartphone today.  Ten years ago a prediction that this would be the future would have been met with scorn or laughter. In fact, in 2005 few if any of the futurists would have even been able to imagine the kind of device most of us now depend upon.  Naturally, the release of the iPhone in 2007 changed everything, but it is likely that the smartphone era was inevitable. Steve Jobs just ushered it in a few years early.

In June of 2012 Tesla released the Model S and the results will be equally transformative.   Current predictions of the future of electric cars are as wrong as any predictions about the future of mobile phones made in 2005.  It is likely that electric car penetration, at least in the US, will take off at an exponential rate over the next 5-10 years rendering laughable the paltry predictions of future electric car sales being made today.  

These predictions are so wrong because they misunderstand the pertinent forcing function. Their assumption is that electric car sales will slowly increase as the technology gets marginally better, and as more and more customers choose to forsake a better product (the gasoline car) for a worse, yet  “greener” version.

...cheaper Teslas will appear and begin to take more and more market share and soon three trends will drive the exponential increase in electric cars on the road.
  • All electric cars will become cheaper and cheaper.
  • The range of these cars will soon match or exceed that of gasoline cars.
  • Gas stations will start to go out of business as many more electric cars are sold, making gasoline powered vehicles even more inconvenient.

The last of these is actually vitally important to understand. Gas stations are not massively profitable businesses. When 10% of the vehicles on the road are electric, many of them will go out of business. This will immediately make driving a gasoline powered car more inconvenient. When that happens even more gasoline car owners will be convinced to switch, and so on. Rapidly a tipping point will be reached, at which point finding a convenient gas station will be nearly impossible.

And speaking of energy - the looming impact of cheap, ubiquitous electricity.
Denmark Just Produced 140% Of Its Electricity Needs Via Wind Power
On Thursday, high winds allowed Denmark to meet all of its electricity needs, with plenty to spare for neighboring countries.

According to the chief commercial officer of the Ecofys energy consultancy, Kees van der Leun, a surge in wind farm installations could allow Denmark to produce half of its electricity from renewable sources by 2020 – that’s only five years away.

Not everyone is likely to be happy about this news, however. As The Guardian shares, the British wind industry is likely to view the Danish achievement with envy. This is no doubt because David Cameron’s government announced a withdrawal of support for onshore wind farms from next year, along with new planning obstacles for onshore wind builds.

Is this the future of learning - of collective intelligence? Or is it the real next ‘Artificial Intelligence’, one that transcends language - the human-human-machine interface? This is a worthwhile read.
Animal brains connected up to make mind-melded computer
Two heads are better than one, and three monkey brains can control an avatar better than any single monkey. For the first time, a team has networked the brains of multiple animals to form a living computer that can perform tasks and solve problems.

If human brains could be similarly connected, it might give us superhuman problem-solving abilities, and allow us to communicate abstract thoughts and experiences. "It is really exciting," says Iyad Rahwan at the Masdar Institute in Dubai, UAE, who was not involved in the work. "It will change the way humans cooperate."

The work, published today, is an advance on standard brain-machine interfaces – devices that have enabled people and animals to control machines and prosthetic limbs by thought alone. These tend to work by converting the brain's electrical activity into signals that a computer can interpret.

Miguel Nicolelis at Duke University Medical Center in Durham, North Carolina, and his colleagues wanted to extend the idea by incorporating multiple brains at once. The team connected the brains of three monkeys to a computer that controlled an animated screen image representing a robotic arm, placing electrodes into brain areas involved in movement.
By synchronising their thoughts, the monkeys were able to move the arm to reach a target – at which point the team rewarded them with juice.

Then the team made things trickier: each monkey could only control the arm in one dimension, for example. But the monkeys still managed to make the arm reach the target by working together. "They synchronise their brains and they achieve the task by creating a superbrain – a structure that is the combination of three brains," says Nicolelis. He calls the structure a "brainet".

This is a nice advance in the continuation of Moore’s law.
IBM and allies advance chip technology two generations into the future
A prototype chip has quadruple the circuitry and double the performance of today's cutting-edge chips. This kind of work, keeping Moore's Law ticking, hastens the day your smartwatch has a lot more brains.
Chip designs two generations more advanced than today's cutting-edge designs are now closer to reality as IBM announced Wednesday it's built a test processor that makes computer circuitry significantly more powerful.

The test chip has working components, called transistors, but it is a research and development project rather than a finished product that can be built into a computing device like a laptop, server or smartphone. Nevertheless, it's an important step extending Moore's Law and its promise of steady progress in the computer industry.

The processor progress charted by Moore's Law has shrunk computers from refrigerator-sized hulks to smartphones that fit in your pocket. But it's getting harder to develop each new generation of chip technology, requiring years of materials research and manufacturing facilities costing about $10 billion. IBM's work signals that it'll be feasible to miniaturize chips further, helping to enable devices like powerful smartwatches or perhaps augmented-reality contact lenses.

"This is a welcome sign for the chip industry," said Envisioneering analyst Richard Doherty. "You can count on at least two more turns of Moore's Law benefits."

New chips and new ways for them to learn. This article points to very interesting possibilities of providing deep learning computers with the large data sets they need to learn – and maybe even learn how to learn.
Minecraft Shows Robots How to Stop Dithering
A new approach to robot learning was tested in Minecraft, the popular open-ended computer game.
The computer game Minecraft, which depicts a world made up of retro, pixelated blocks that can be modified and rearranged in endless architectural configurations, has been praised for teaching young players about creativity, problem solving, and survival skills (in certain modes you have to avoid threats including zombies). Well, it turns out even inexperienced robots can learn a thing or two by playing the game.

Stefanie Tellex, a professor at Brown University, is using Minecraft, as well as real-world machines, to explore ways for robots to solve new problems faster and more efficiently. This isn’t something most robots need to do, since they work in a fixed environment, performing work that has been carefully programmed beforehand. But it could be important as robots start to take on more complex, open-ended tasks in less structured settings. A robot designed to help around the home, for instance, would need to figure out how to perform different chores.

“You might tell a robot ‘Make me coffee,’ but the next minute you might say ‘Do the laundry,’” says Tellex. “In this context, where you don’t know the goal in advance, there’s this planning problem. Finding the sequence of actions that’s going to work in this particular environment is very challenging. Our approach is about learning that faster.”
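The planning problem Tellex describes can be made concrete with a toy example. This is not her actual system, just a minimal breadth-first search over actions in a hypothetical grid world: the goal isn't known in advance, so the robot searches for a sequence of actions that works in this particular environment.

```python
from collections import deque

# Toy planner: find a shortest action sequence from start to goal in a
# small grid world, avoiding blocked cells (all names are illustrative).
ACTIONS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def plan(start, goal, blocked=frozenset(), bound=5):
    """Breadth-first search for a shortest action sequence to the goal."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        pos, path = frontier.popleft()
        if pos == goal:
            return path
        for name, (dx, dy) in ACTIONS.items():
            nxt = (pos[0] + dx, pos[1] + dy)
            if (nxt not in seen and nxt not in blocked
                    and abs(nxt[0]) <= bound and abs(nxt[1]) <= bound):
                seen.add(nxt)
                frontier.append((nxt, path + [name]))
    return None  # goal unreachable within the bound
```

For example, `plan((0, 0), (2, 0), blocked=frozenset({(1, 0)}))` routes around the obstacle in four moves. Tellex's research is about learning to find such sequences faster than blind search.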

Bruce Sterling is an amazing journalist of the emerging digital environment, a science fiction writer, and a great futurist. He often sounds deeply skeptical, yet he’s a fantastic observer of the world where art and technology merge. This interview is both fun and informative about the boundaries of projects that imagine the future.
Interview with Sci-Fi Author Bruce Sterling: Alien-Computer Interfaces
For example:
AM: Would we understand aliens?
BS: Well, it helped me to understand “Hollywood interfaces” when I learned more about the genuine user interfaces that actually make Hollywood movies. A lot of Hollywood is about Hollywood, and a lot of “special FX” is about special FX technology.

For instance, take the famous scene in Blade Runner (1982), in which Harrison Ford analyzes a photo. This scene is obviously of keen, heartfelt interest to the guys in the “Blade Runner FX Lab.” Similarly, the tech guys doing FX for Minority Report (2002) are really licking their chops over that gesture-based Oblong Industries interface, which, by the way, exists in Los Angeles today and is used mostly to make cinema.

What you’re seeing in those film sequences is digital imaging professionals projecting their own power fantasies to a large popular audience. That’s pretty much bound to be interesting to scholars of technology. It really is diegetic too, since the Minority Report interface is used again and again to clue people in about gesture recognition technologies. They still don’t know how it works, but it helps to persuade them that it’s coming.

Prometheus (2012) tried to address understanding aliens, but only because the aliens were super previous humans, and the thing that learned how to communicate with them was an android that had studied and mastered all previous human languages—and the aliens still tore its head off! Also, District 9 (2009) dealt with this topic when the human hero sat in an alien’s seat that just happened to fit his rear end and was able to use the controls. Marvelous coincidence, no? Or was the alien technology so advanced it could anticipate weird users, like humans?

Even though I’m a science fiction writer, I’m not too worried about our insulting space aliens by misrepresenting them and their hardware in our films. It’s not like they’ll complain to Amnesty International.

“Realistic” aliens—like the postulated inhabitants of Jupiterian gas-bag planets—aren’t going to show up in Hollywood movies because they’re hard to dramatize on a movie screen. What kind of dramatic dialogue is the screenwriter going to pen for an imaginary creature that lives for millennia, breathes liquid ammonia, and is a couple of kilometers across? Even if you worked all that out in exquisite extrapolatory sci-fi detail, the audience would just leave the theater when shown it. It’s not proper movie material; the viewers would find it abstract and dull. It’s like asking the audience in the film Avatar (2009) to identify with the mystic magic tree rather than all the sexy blue people.

Hal Clement’s science fiction novel Mission of Gravity has some pretty good alien invention in it. These aliens live under massive gravity, so they’re basically intelligent centipedes. If you’re into the imaginative invention of severely alien life forms, Hal Clement did a bang-up job there. No Hollywood producer is going to finance a movie about intelligent centipedes. Because they’re creepy and genuinely alien, they don’t have that Hobbit-style huggability that commands a wide audience.

However, one might do a pretty good five-minute Vimeo style FX film about intelligent alien centipedes. It would have a niche audience, but it would also have a niche cost. This is one reason why “design fiction” is thriving in short online movies today. Nowadays we’ve got the technical capacity to visually dramatize truly startling things, effectively, on a small scale.

Here’s an interesting link to the MIT Design Fiction group's site, with a long list of projects. An interesting approach to foresight thinking.
Sparking imagination and discussion about the social, cultural, and ethical implications of new technologies through design and storytelling.
The Design Fiction group explores how to spark imagination and discussion about the social, cultural, and ethical implications of new technologies through design and storytelling. The group also explores alternative ways to encourage debate using social/viral media and popular culture.

This is not ready for primetime, but it does point to a trajectory for thinking about the future of water, which won’t be what it was in the past.
Nano Water Chip Could Make Desalination Affordable for Everyone
With freshwater declining throughout the globe, desalination looks increasingly attractive, but current technologies are expensive, demand far too much energy and are prone to contamination. Now researchers from the University of Texas at Austin and the University of Marburg in Germany have developed a “water chip” that creates a small electrical field that separates salt from seawater. The technology, which is still under development and works at the nano-scale, uses so little energy it can run off a store-bought battery!

The researchers apply a 3.0 volt electrical charge to the plastic water chip, which has a microchannel with two branches. By creating an “ion depletion zone” with an embedded electrode that neutralizes chloride ions, they are able to redirect the salts in the water down one channel, while the fresh water goes down another.

“Like a troll at the foot of the bridge, the ion depletion zone prevents salt from passing through, resulting in the production of freshwater,” the team wrote in a recent press release.

Less energy-intensive than current desalination plants, the water chip doesn’t rely on a membrane, and can be made portable so that just about anybody living near the sea can purify their own water at home.

Currently the technology purifies just one nanoliter at a time and only has a 25% efficiency rate, but the team is confident that their proof of concept can be first improved and then scaled up.

Here’s something I can definitely resonate with.
People Who Are Messy Aren’t Lazy, They’re Imaginative And Bold
We live in a very formulaic and predictable world. Almost everything is neatly packaged and systematized. Society perpetually seeks to maintain order, in every sense of the word.
But it’s all an illusion. We have been taught to value superficial notions of symmetry.

Organization is a comfort pillow that lies to us and tells us life isn’t really the random, chaotic mess we secretly know it to be.
In our attempts to establish order, we often create disorder. When we buy new clothes or shoes to appear put together in public, for example, our closets begin to overflow.
When we throw away trash, it goes to a landfill and contributes to pollution.

Research conducted by Kathleen Vohs, PhD, of the University of Minnesota Carlson School of Management, found that cluttered environments help induce greater levels of creativity.
In one of the experiments conducted for this study, Vohs split up a group of 48 participants and asked them to find new ways to utilize a ping pong ball. One half was placed in a tidy room, the other half in a messy room.

In the end, both groups came up with the same number of ideas, but the ideas produced by those in the untidy room were judged far more innovative by a panel of independent judges. As Vohs puts it:
Being in a messy room led to something that firms, industries and societies want more of: creativity.
Disorderly environments seem to inspire breaking free of tradition, which can produce fresh insights.
Orderly environments, in contrast, encourage convention and playing it safe.
