
Friday Thinking 6 July 2018

Hello all – Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.)  that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning. Work that engages our whole self becomes play that works. Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9

Modern AI: Statistical Correlation and Radical Empiricism
In the past 20 years, some significant infrastructural developments have emerged that allow for a fundamentally different type of AI.

We now have access to large data sets from which to build statistical models (internet), along with the infrastructure to support massive amounts of data (cloud storage). Equipped with paradigms to compute over large data (MapReduce, Spark), statistical and Bayesian algorithms (research labs and universities), and computational resources to run iterative algorithms (Moore's Law, GPUs, and increasingly custom FPGAs/ASICs dedicated to optimizing machine learning calculations), modern Artificial Intelligence begins to emerge: one rooted in statistical correlation.

Statistical correlation is based on implementing learning models that a computer applies to large datasets. These statistical models are shaped by empirical performance and encode a correlation-driven representation of historical learnings. We can now assume that, given the right datasets, modern AI classifiers can generate high-performing predictive models without any dependency on the underlying principles that govern a problem.
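
To make that concrete, here is a minimal sketch (my illustration, not the article's - using scikit-learn and its bundled breast-cancer dataset) of a classifier whose predictive power comes entirely from correlations in historical data, with no domain principles encoded anywhere:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# No biological model of cancer is encoded anywhere below: the classifier's
# predictive power comes entirely from correlations in historical examples.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # strong accuracy from correlation alone
```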

An Epistemological Undoing of Intelligence
The modern statistical approach towards AI is a fundamental undoing of the way that we think about problems and what it means to solve a problem. It is now as much about philosophical choices in the semantics of learning and knowing as it is about feature selection and classifier choice.

The Path Forward: Questions, not Answers
AI now has the exciting potential to advance fields where our own grasp of first principles is at best incomplete: biology, pharmaceuticals, genomics, autonomous vehicles, robotics, and more. Progress in these disciplines is already transforming our society in dramatic ways, and at an accelerating pace.

As this technology continues to develop, it will also generate challenging questions around the key topics of attribution and explainability.

The New Intelligence




In shallow play, the game plays you, and you maximize utility in some rational means-ends sense. You play the game as a finite game, the game plays you as a finite being. You play to win, and the game wins you.

In deep play you play the game, and you do so to renew your sense of self, reacquaint yourself with the characteristic nature of your internal how/why entanglement, thereby figuring out how to continue playing. Or eigenplaying.

Your eigenvalues and eigenvectors




The Human Genome Project is a great example of speculative exuberance being necessary for innovation.
Over the past several weeks in Snippets, we've been talking about the emerging practice of Synthetic Biology, how it fits into a longer story arc of innovation in modern molecular biology, and how it's similar to, but also different from, the software and internet boom. As we've been previewing for several issues now, one of the biggest points of comparison between molecular biology and the software industry has been the plummeting cost of a critical input, which we saw in both. In the IT industry we called it Moore's Law; in biology it's the plummeting cost of genomic sequencing, which has actually outpaced Moore's Law to a remarkable degree.

Over the last twenty years, the cost of sequencing a human genome has dropped from over $100 million to under $1,000: that's nearly three times as fast a rate of compounding as Moore's Law's famous doubling of performance per dollar every 1.5 to 2 years. (To put that into perspective: Moore's Law is considered one of the modern miracles of compounding technological dividends, and anyone who understands compound interest knows what it's like to fight for single-digit percent improvements in annual gains. A 300% improvement over what is considered the canonical example? That's ridiculous.)
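
As a rough sanity check on compounding claims like this, you can compute the halving time implied by a cost curve's endpoints. The sketch below uses the figures quoted above, plus my own approximation of the curve's steepest stretch (roughly $10M in 2007 to $10K in 2011); how the rate compares to Moore's Law depends heavily on which window you measure:

```python
import math

def implied_halving_time(cost_start, cost_end, years):
    """Years per cost halving implied by a drop over a given window."""
    halvings = math.log2(cost_start / cost_end)
    return years / halvings

# The endpoints quoted above: over $100M to under $1,000 across ~20 years.
print(implied_halving_time(100e6, 1000, 20))  # ~1.2 years per halving

# The steepest stretch of the published cost curve (roughly $10M in 2007 to
# $10K in 2011 - my approximation) compounds far faster than Moore's Law:
print(implied_halving_time(10e6, 10e3, 4))    # ~0.4 years per halving
```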

Moore’s Law didn’t just simply “happen”: it was an outcome of decades of prior investment, subsidy, and speculation, which set the stage for a combination of technological innovation and market forces that made the modern software industry possible.

In biology, no financial bubble was responsible; instead, it was a different kind of speculative exuberance: an enormous, coordinated investment of billions of dollars of financial capital plus an arguably even larger amount of human capital over fifteen years into an initiative called the Human Genome Project.

Eighteen years later, we now see clearly that the data generated through the project and the equipment that was left behind both ended up being largely squandered, but we got something far more valuable instead. As Church calls it, the project created an 'overhang' of nascent science and technology which led to Next Generation Sequencing (which uses light instead of chemical reactions as a data transfer mechanism and can sequence DNA orders of magnitude faster than the older methods, at a fraction of the cost), which in turn led to the plummeting costs and explosion of data that left the original project's output in the dust.

The Human Genome Project: April 15, 2018





"... the breakthrough came when we realized that we should not assume water is incompressible. Now that we understand what's happening in the computer simulations, we are able to reproduce this phenomenon in theoretical calculations."

The scientists undertook this study to test new methods in graphene-nanopore DNA sequencing. Over the last couple of years, graphene nanopores have shown tremendous promise for inexpensive DNA sequencing. The way it works is that DNA is suspended in water, and then the DNA, water and ions are pulled by an electric field through a tiny hole in a graphene membrane. The electric field applied across the graphene sheet attracts the dissolved ions and any charged particles - and DNA is a negatively charged particle. The DNA's four nucleobases are read as the differences in the flow of ions that each distinctively shaped nucleobase produces.

Water compresses under a high gradient electric field
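
For the curious, here is a toy illustration of that read-out idea - mapping a measured ion-current level to the nearest base signature. The numbers are invented; real base-calling deals with noisy continuous signals and probabilistic models:

```python
# Invented current levels for illustration only; real base-calling works on
# noisy continuous signals and uses probabilistic models, not a lookup.
LEVELS = [(0.2, "T"), (0.4, "G"), (0.6, "C"), (0.8, "A")]  # hypothetical

def call_base(current):
    # choose the base whose characteristic current is closest to the reading
    return min(LEVELS, key=lambda lv: abs(lv[0] - current))[1]

readings = [0.79, 0.41, 0.22, 0.61, 0.38]       # simulated ion-current dips
print("".join(call_base(r) for r in readings))  # -> AGTCG
```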




For anyone serious about foresight studies, the future, and futures literacy - this is a just-released, freely available book, edited by a Canadian, Riel Miller. Riel is brilliant and the book is a must-read for serious futurists.

Transforming the Future (Open Access)

Anticipation in the 21st Century

This book exemplifies UNESCO's role as the laboratory of ideas for the United Nations, raising new questions today by changing our understanding of tomorrow. There can be no assurances that the choices we make today will create a better tomorrow – but we can become better able to harness our imagination to grasp the potential of the present and craft ways to act that are consistent with our values. This book opens a new field for innovation in exploring how humanity can live better with the uncertainty and creativity of a complex evolving universe for the benefit of all.


This may be a very important signal - of the reclamation of the Web - and more progress toward decentralized, distributed approaches to the Internet - one that aims at preventing the enclosure of the Web by corporate platforms. This project is led by the inventor of the Web, Tim Berners-Lee.
What made the Web powerful, and ultimately dominant, however, would also one day prove to be its greatest vulnerability: Berners-Lee gave it away for free; anyone with a computer and an Internet connection could not only access it but also build off it. Berners-Lee understood that the Web needed to be unfettered by patents, fees, royalties, or any other controls in order to thrive. This way, millions of innovators could design their own products to take advantage of it.

Solid

Solid (derived from "social linked data") is a proposed set of conventions and tools for building decentralized social applications based on Linked Data principles. Solid is modular and extensible and it relies as much as possible on existing W3C standards and protocols.
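
Because Solid builds on existing Linked Data standards, a profile document on a Solid pod is just RDF that any standard tool can read. A minimal sketch in Python with rdflib (the profile text and URLs are invented for illustration):

```python
import rdflib

# A tiny WebID-style profile of the sort a Solid pod might serve.
# The names and URLs are invented for illustration.
profile = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<https://alice.example/profile/card#me> a foaf:Person ;
    foaf:name "Alice" ;
    foaf:knows <https://bob.example/profile/card#me> .
"""

g = rdflib.Graph()
g.parse(data=profile, format="turtle")

FOAF = rdflib.Namespace("http://xmlns.com/foaf/0.1/")
me = rdflib.URIRef("https://alice.example/profile/card#me")
print(g.value(me, FOAF.name))             # -> Alice
for friend in g.objects(me, FOAF.knows):  # links point into other people's pods
    print(friend)
```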


Here’s an alternative to YouTube - a decentralized, distributed version of video sharing and participation.

PeerTube

PeerTube isn’t a single video hosting platform with a single group of rules: it’s a network of dozens of interconnected hosting providers, and each provider is composed of different people and administrators. You don’t like some of the rules? You’re free to join the hosting provider of your choice, or even better, be your own hosting provider with your own rules!


This is a 26-minute must-see video that presents a real-world effort to implement a new economic model for a more progressive democratic society.

Building the Democratic Economy, from Preston to Cleveland

Two forms of government have dominated in the West over the last hundred years. In one, big power is vested in the state, the government; in the other, policy is dominated by the influence of big industry, big corporations, or big money. Well, a hundred years after the Russian revolution, and ten years after the financial crash, a whole lot of people all around the world are asking "are there any alternatives?" - especially as neither of those models has delivered on a promise of shared prosperity. In Preston, Lancashire, England, a formerly industrial city - the birthplace of the industrial revolution in many ways - they've seen ten years of austerity, and partly out of need, partly out of aspiration, they're practicing and experimenting with a new model.

They’re calling it the Preston model of community wealth building, and it’s inspired by a model in another formerly industrialized city: Cleveland, Ohio, the Evergreen Cooperative model. On today’s program a transatlantic experiment in cooperative community wealth building.


This is a good signal of the ongoing transformation of our ideas about organizations and the future of how we will work. Some of this will sound familiar to anyone who has heard me talk.

IFTF and Google Cloud Create Cutting Edge Guide to Building a Resilient 21st Century Company

Institute for the Future and Google Cloud released a guide to help companies of all sizes move their organizations into the digital age by taking a holistic and systematic view of the impact of emerging technologies on the future of work, organizations and IT leadership.

The guide, in the form of a map called “Beyond Organizations: New Models for Getting Things Done,” is part of a research series exploring how new ways of working will change the pace of business and innovation, empowering organizations to be more diverse, dynamic and distributed.
This map provides leaders with the tools needed to seamlessly transition and thrive in distributed and fluid organizational environments.

The map outlines:
The three major emerging technologies enabling new ways of organizing
The seven powerful transformations in organizational processes
The five skills individuals will need to thrive in the future
Here is the link to the PDF of the map.

Resource Allocation - From managers to processes
Planning - From periodic strategic plans to continuous feedback loops
Recruitment - From resumes to reputations
Synchronization - From co-located to distributed
Compensation - From money to portfolios of incentives
Boundaries - From closed to open
Scaling - From staff to networks of contributors


Accelerating advances in technology include software - here's something that may have a significant impact on AI.

'Breakthrough' algorithm exponentially faster than any previous one

Computer scientists at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a completely new kind of algorithm, one that exponentially speeds up computation by dramatically reducing the number of parallel steps required to reach a solution.

The researchers will present their novel approach at two upcoming conferences: the ACM Symposium on Theory of Computing (STOC), June 25-29, and the International Conference on Machine Learning (ICML), July 10-15.

Take this toy example:
You're in the mood to watch a movie similar to The Avengers. A traditional recommendation algorithm would sequentially add, in every step, a single movie that has similar attributes to The Avengers. In contrast, the new algorithm samples a group of movies at random, discarding those that are too dissimilar to The Avengers. What's left is a batch of movies that are diverse (after all, you don't want ten Batman movies) but similar to The Avengers. The algorithm continues to add batches in every step until it has enough movies to recommend.
This process of adaptive sampling is key to the algorithm's ability to make the right decision at each step.

"Traditional algorithms for this class of problem greedily add data to the solution while considering the entire dataset at every step," said Eric Balkanski, graduate student at SEAS and co-author of the research. "The strength of our algorithm is that in addition to adding data, it also selectively prunes data that will be ignored in future steps."


Data storage that is long-term and robust may soon be a biological and very personal domain.

Storing information on DNA is now cheap enough to be viable

We need to own up to the fact that we’ve become digital hoarders running out of room for our data.

In 2016, humans collectively generated 16.1 trillion gigabytes of digital information; that annual number is expected to increase by more than tenfold by 2025. Our personal pictures, texts, and emails are just a drop in the bucket; the real deluge comes from the scientists creating vast swaths of information as they run experiments and clinical trials looking deep into the smallest components of biology, and observe other planets, looking as deeply into the vast universe as is possible. And the places we currently put that data—external hard drives and cloud server rooms, for the most part—aren’t a perfect solution. They take up a lot of space, and need an upgrade every decade or so.

Biotech startups are looking within to solve the problem. Specifically, within our cells.
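
The basic idea is simple to illustrate: digital bits map onto the four bases. The toy below uses a naive two-bits-per-base scheme; real DNA storage systems add error correction and avoid problematic sequences such as long single-base runs:

```python
# Naive two-bits-per-base mapping, for illustration only. Real DNA storage
# schemes add error-correcting codes and avoid long single-base repeats.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
TO_BITS = {v: k for k, v in TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hello")
print(strand)                      # -> CGGACGCCCGTACGTACGTT
assert decode(strand) == b"hello"  # round-trips losslessly
```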


Blockchain and distributed ledger technologies continue to gain credibility.
“Blockchains will become increasingly critical to doing business globally,” said Boneh, the Rajeev Motwani Professor in the School of Engineering, and an expert on cryptography and computer security. “Stanford should be at the forefront of efforts to improve, apply and understand the many ripple effects of this technology.”

Stanford computer scientists launch the Center for Blockchain Research

The new center will address blockchain’s practical, legal and societal challenges, and develop a curriculum to facilitate its use across a variety of fields and applications.
Stanford computer scientists have founded the Center for Blockchain Research, an initiative dedicated to researching and understanding a technology that promises to fundamentally change how people and companies make deals and complete financial transactions over the internet.

The center will bring university scientists and industry leaders together to develop best practices for this burgeoning and potentially transformative field. In addition to research, center scientists are creating courses to help future students and working professionals use blockchain to develop financial instruments, protect intellectual property, manage vital records and more.


Here is one possible business case for the power of distributed ledgers.

From diamonds to recycling: how blockchain can drive responsible and ethical businesses

Enabling supply chain transparency
Provenance is crucial for transparency in the business environment. The inability to identify and track an asset's provenance, caused by opaque supply chains, is detrimental, as there is no way to determine the asset's authenticity. This may lead to significant financial losses and cost businesses considerable goodwill.

As assets move along the supply chain, the use of blockchain allows them to be tracked permanently. The transactional data proves the provenance and is accessible to all relevant parties from a single orchestrated source of information.

A visible and auditable trail is created, ensuring full transparency to create an ecosystem of trust amongst stakeholders. This traceability solution can be applied across numerous industries globally.

Identifying this innovative frontier and developing its full potential will create new business models. Full transparency provides confidence in legitimate operations, enables greater access to finance, and exposes illicit practices. Combining blockchain with other technologies can extend its use beyond data storage along the supply chain.
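
The mechanism underneath such traceability is an append-only, hash-chained record: each entry commits to the one before it, so tampering anywhere upstream breaks verification. A minimal sketch (omitting the consensus, signatures, and replication a real deployment needs):

```python
import hashlib, json, time

def make_record(prev_hash, asset_id, event):
    """Append a record that cryptographically commits to its predecessor."""
    record = {"prev": prev_hash, "asset": asset_id,
              "event": event, "ts": time.time()}
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

def verify(chain):
    """Recompute each record's hash and check the link to its predecessor."""
    for prev, rec in zip(chain, chain[1:]):
        body = {k: v for k, v in rec.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if rec["prev"] != prev["hash"] or \
           rec["hash"] != hashlib.sha256(payload).hexdigest():
            return False
    return True

chain = [make_record("genesis", "diamond-001", "mined, Botswana")]
chain.append(make_record(chain[-1]["hash"], "diamond-001", "cut and polished"))
chain.append(make_record(chain[-1]["hash"], "diamond-001", "sold at retail"))
print(verify(chain))  # True; editing any earlier record makes this False
```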


The acceleration of the phase transition of energy geopolitics is stunning.
... it confirms that solar is now well and truly the cheapest form of new energy generation, and is well under the cost of wholesale electricity in Australia – even with the cost of storage added.
Kay’s comments were echoed by Leonard Quong, an analyst with Bloomberg New Energy Finance, who told the conference: “The costs are coming down faster than I can update my charts.”

Australia solar costs hit “extraordinary” new lows – $50s/MWh

The cost of building new large-scale solar energy generation in Australia has fallen to an “extraordinary” new low, the head of the Australian Renewable Energy Agency has said, citing industry reports of numbers down around the $50/MWh mark.

Australia’s PV price plunge has seen the cost of utility-scale solar fall from around $135/MWh when ARENA launched its first auction in 2015, to “somewhere in the $50s” today, or $1/W, ARENA chief Ian Kay said on Wednesday.

“So we’ve gone from $1.60/W ... to, I think you’ll actually find that people are now talking about, by the next round of projects that are being developed, due to be financed in the next 12 months, bidding under $1/W,” Kay told the Large-scale Solar & Storage conference in Sydney, co-hosted by RenewEconomy.


This is one more signal of the accelerating trend toward a phase transition in energy geopolitics - based on near zero marginal cost energy.
Atlanta is among more than 70 U.S. cities that have adopted a 100 percent renewable electricity goal, according to a tally by the Sierra Club. That number has more than doubled in the last year as mayors and cities have reacted to President Donald Trump's announcement that he was pulling the United States out of the 2015 Paris climate agreement.

Atlanta Charts a Path to 100 Percent Renewable Electricity

A City Council committee is considering three proposals. The goals: Make this Southern city a leader in renewable power and fight climate change.
If Atlanta can get to 100 percent clean electricity, then any city can, Al Gore said. Now that signature Southern city in a deep red state has a plan to do just that.
On Tuesday, city officials took their new road map for a greener future to the Atlanta City Council, outlining options they say can fight climate change, improve health and bolster the economy all at once.

They initially planned to recommend giving the city until 2050 to meet the goals. That would have been 15 years slower than the pace the council agreed to a year ago, but city officials wanted more time to make the kinds of changes needed for a homegrown energy transformation, rather than relying on buying credits from wind farms beyond Georgia's borders.

At Tuesday's council committee meeting, they shifted their recommendation back to 2035, after what a spokeswoman described as further review and consultation.


One more important signal in the accelerating phase transition to a new energy geopolitics.
PG&E's procurement request is also a "landmark event" because the utility would own the 182.5 MW, 730 MWh project it is building with Tesla. "I imagine this is the largest utility-owned, non-hydro, storage project in the world by far," Eller said. "Essentially, I see this as one of our largest utilities recognizing that the technology risk of battery storage is minimal and they have full faith that it can deliver the promised benefits at a competitive cost."

PG&E to replace 3 gas plants with world's biggest battery projects

Pacific Gas and Electric (PG&E) late last week requested approval from the California Public Utilities Commission (CPUC) for four energy storage projects totaling about 2,270 MWh.

The CPUC authorized PG&E to issue a solicitation for energy storage projects to replace three power plants that would otherwise require reliability must-run (RMR) contracts.

PG&E selected offers of three energy storage projects from third-party owners, totaling 385.5 MW, 1,540 MWh, and one 182.5 MW, 730 MWh project the utility would own.


Google has been a leading pioneer of the self-driving car - and Google cofounder Larry Page is now backing a pioneering flying car.

We took an exclusive ride in a flying car

"It's as easy to use as playing Minecraft," Kitty Hawk CEO Sebastian Thrun said as we watched my colleague Rachel Crane pull on a motorcycle helmet.
Rachel and I had just flown into Las Vegas for an exclusive first look at the Silicon Valley single-seat flying machine, Flyer.

Kitty Hawk, funded by Google cofounder Larry Page and led by Thrun, a self-driving car pioneer, attracted nationwide attention when it teased its Flyer prototype last year.

"The joystick is so intuitive, but it's not the most comfortable thing I've ever sat in," she told me later of the driver's seat. "You definitely feel the vibrations."

Flyer makes it easy to hover in place even when a human pilot isn't touching the controls. There are no complex controls, instrument clusters, or screens to monitor. The Kitty Hawk team tested everything from a steering wheel to video game controllers and boat throttles to find a design people would feel most comfortable using.


The learning curve of AI-driven robotics - not only can one robot learn from observation, but what one robot learns, all robots learn. The GIFs are worth the view.

One-Shot Imitation from Watching Videos

Learning a new skill by observing another individual - the ability to imitate - is a key part of intelligence in humans and animals. Can we enable a robot to do the same, learning to manipulate a new object by simply watching a human manipulate the object, just as in the video below?

Such a capability would make it dramatically easier for us to communicate new goals to robots – we could simply show robots what we want them to do, rather than teleoperating the robot or engineering a reward function (an approach that is difficult as it requires a full-fledged perception system). Many prior works have investigated how well a robot can learn from an expert of its own kind (i.e. through teleoperation or kinesthetic teaching), which is usually called imitation learning. However, imitation learning of vision-based skills usually requires a huge number of demonstrations of an expert performing a skill. For example, a task like reaching toward a single fixed object using raw pixel input requires 200 demonstrations to achieve good performance according to this prior work. Hence a robot will struggle if there’s only one demonstration presented.


Despite recent changes in projected statistics that are a consequence of the opioid crisis, life expectancy in most of the world continues to increase.
Anyone who studies the limits of longevity faces two major statistical challenges.
There aren’t very many people who live to advanced ages, and people that old often lose track of how long they’ve actually lived. “At these ages, the problem is to make sure the age is real,” said Dr. Barbi.

How Long Can We Live? The Limit Hasn’t Been Reached, Study Finds

The mortality rate flattens among the oldest of the old, a study of elderly Italians concludes, suggesting that the oldest humans have not yet reached the limits of life span.
Since 1900, average life expectancy around the globe has more than doubled, thanks to better public health, sanitation and food supplies. But a new study of long-lived Italians indicates that we have yet to reach the upper bound of human longevity.

“If there’s a fixed biological limit, we are not close to it,” said Elisabetta Barbi, a demographer at the University of Rome. Dr. Barbi and her colleagues published their research Thursday in the journal Science.

The current record for the longest human life span was set 21 years ago, when Jeanne Calment, a Frenchwoman, died at the age of 122. No one has grown older since — as far as scientists know.
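
To see what a mortality plateau means, compare a classic Gompertz hazard, which keeps rising exponentially with age, against one that levels off. The parameters below are invented for illustration, not the study's fitted values:

```python
import math

# Illustrative parameters only (not the study's fitted values): a Gompertz
# hazard roughly doubles every 8 years, while the plateau variant levels off
# at very old ages, as Barbi and colleagues report for ages above ~105.
def gompertz_hazard(age, a=1e-4, b=0.085):
    return a * math.exp(b * age)

def plateau_hazard(age, cap=0.65):
    return min(gompertz_hazard(age), cap)

for age in (80, 95, 105, 110, 115):
    print(age, round(gompertz_hazard(age), 3), round(plateau_hazard(age), 3))
```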


The ongoing efforts in the many domains involved with domesticating DNA aren't limited to biological fields - here's a good signal of the ever-blurring borders that enable living systems to embrace many new forms of enhancement. The question becomes ever more salient: what is a living system, and where does it end?
The advance is notable because Sun and his colleagues were able to demonstrate that their robots work in animals. “It’s really uncertain how to make these tiny machines move in living organisms,” says Bradley Nelson, a microroboticist at ETH Zürich, who was not involved in the project.

Microbots Deliver Stem Cells in the Body

Magnetically-controlled microrobots gently carry cells to hard-to-reach organs
The astonishing thing about stem cells is that they can be coaxed, in the laboratory, into becoming nearly any kind of cell—from bone marrow to heart muscle. That remarkable capability has for years kept scientists busy tinkering with stem cells and injecting them into animal models in an attempt to grow and replace damaged tissue.  

Researchers typically deliver stem cells via injection—a needle. But that method can damage healthy tissue, especially when the target is a deep brain structure, or delicate vasculature, or the inner ear.

A group out of Hong Kong announced this week that they had invented a new delivery tool using tiny, magnetically-controlled robots. The cell-carrying machines move noninvasively through the body to a target site and deliver their stem cell cargo.

The researchers, led by Dong Sun, a professor at City University of Hong Kong, demonstrated their device in zebrafish and mice, and reported their success in the journal Science Robotics.


The speed of evolution of bacteria can be very difficult for us to grasp - and it can both aid us and compel us to continue the efforts to domesticate DNA.
“It looks like it breaks it down into harmless by-products that don’t do any environmental damage, so right now what it’s doing is breaking down the hydrocarbons within the plastic, and then the bacteria is able to use that as food and fuel,” she said. “So essentially it’s using that to live. It’s essentially turning plastic into food.”

Plastic-eating bacteria discovered by student could help solve global pollution crisis

A student may have found a solution to one of the world’s most urgent environmental crises – breeding bacteria capable of “eating” plastic and potentially breaking it down into harmless by-products.

The microbes degrade polyethylene terephthalate (PET) – one of the world’s most common plastics, used in clothing, drinks bottles and food packaging.
It takes centuries to break down, in the meantime doing untold damage to its surroundings.


Here’s a signal of an inevitable trajectory, as the human-digital-environment interface becomes entangled and the technologies become accessible to DIY experimenters.

Biohacker Meow-Ludo Disco Gamma Meow-Meow who implanted Opal Card into hand escapes conviction

Mr Meow-Meow, 33, pleaded guilty to using public transport without a valid ticket and for not producing a ticket to transport officers.
In March, he was fined $220 for breaching the Opal Card terms of use and was ordered to pay $1,000 in legal costs.

Mr Meow-Meow appealed against the conviction in the District Court and today it was quashed.
District court judge Dina Yehia took into account his good character, while describing the case as "highly unusual … involving a unique set of circumstances."

Judge Yehia said Mr Meow-Meow had not tampered with the Opal Card in order to avoid paying the fine.
However, Judge Yehia upheld the court costs.


Robotics and AI continue to expand their potential uses, either replacing human capabilities or enhancing them.

A Robot Just Operated On A Human Eye for the First Time

From prostate surgery to gallbladder procedures, robots are already mainstays in the operating room. Now, they’re coming for your eyes.

In 2016, researchers from the University of Oxford’s Nuffield Department of Clinical Neurosciences kicked off a clinical trial to test the PRECEYES Surgical System, a robot designed to perform surgery on the retina, the surface at the back of the eyeball. On Monday, they published the results of their robot-assisted eye surgery trial in the journal Nature Biomedical Engineering.

A surgeon uses a joystick to control the mobile arm of the PRECEYES system. Doctors can attach various instruments to the arm, and because the system is robotic, it doesn’t suffer from any of the slight tremors that plague even the most steady-handed of humans.

A robot-assisted eye surgery did take about three times as long as a traditional one, but trial leader Robert MacLaren told New Scientist that was just because the surgeons were unfamiliar with the robot and moved slowly out of caution.
