Thursday, September 26, 2019

Friday Thinking 27 Sept 2019

Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.) that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st Century

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9



it’s disquieting to think of the police using a corpse to break into someone’s digital afterlife.

Most democratic constitutions shield us from unwanted intrusions into our brains and bodies. They also enshrine our entitlement to freedom of thought and mental privacy. That’s why neurochemical drugs that interfere with cognitive functioning can’t be administered against a person’s will unless there’s a clear medical justification. Similarly, according to scholarly opinion, law-enforcement officials can’t compel someone to take a lie-detector test, because that would be an invasion of privacy and a violation of the right to remain silent.

But in the present era of ubiquitous technology, philosophers are beginning to ask whether biological anatomy really captures the entirety of who we are. Given the role they play in our lives, do our devices deserve the same protections as our brains and bodies?

Andy Clark and David Chalmers argued in ‘The Extended Mind’ (1998) that technology is actually part of us. According to traditional cognitive science, ‘thinking’ is a process of symbol manipulation or neural computation, which gets carried out by the brain. Clark and Chalmers broadly accept this computational theory of mind, but claim that tools can become seamlessly integrated into how we think. Objects such as smartphones or notepads are often just as functionally essential to our cognition as the synapses firing in our heads. They augment and extend our minds by increasing our cognitive power and freeing up internal resources.

If accepted, the extended mind thesis threatens widespread cultural assumptions about the inviolate nature of thought, which sits at the heart of most legal and social norms.

The Nobel Prize-winning physicist Richard Feynman used to say that he thought with his notebook. Without a pen and pencil, a great deal of complex reflection and analysis would never have been possible. If the extended mind view is right, then even simple technologies such as these would merit recognition and protection as a part of the essential toolkit of the mind.

Are ‘you’ just inside your skin or is your smartphone part of you?




Most of the time our perception of conscious control is an illusion. Many neuroscientific and psychological studies confirm that the brain’s ‘automatic pilot’ is usually in the driving seat, with little or no need for ‘us’ to be aware of what’s going on. Strangely, though, in these situations we retain an intense feeling that we’re in control of what we’re doing, what can be called a sense of agency. So where does this feeling come from?

...a fundamental paradox about consciousness. We have the strong impression that we choose when we do and don’t act and, as a consequence, we hold people responsible for their actions. Yet many of the ways we encounter the world don’t require any real conscious processing, and our feeling of agency can be deeply misleading.

Take the subjective experience of fluency: the easier it feels to do something, the more likely you are to think that you’re in control of the action. 

Responsibility [Response-Ability], then, is the real currency of conscious experience. In turn, it is also the bedrock of culture. Humans are social animals, but we’d be unable to cooperate or get along in communities if we couldn’t agree on the kinds of creatures we are and the sort of world we inhabit. It’s only by reflecting, sharing and accounting for our experiences that we can find such common ground. To date, the scientific method is the most advanced cognitive technology we’ve developed for honing the accuracy of our consensus – a method involving continuous experimentation, discussion and replication.

Our illusory sense of agency has a deeply important social purpose




Has the behaviour of another person ever made you feel ashamed? Not because they set out to shame you but because they acted so virtuously that it made you feel inadequate by comparison. If so, then it is likely that, at least for a brief moment in time, you felt motivated to improve as a person. Perhaps you found yourself thinking that you should be kinder, tidier, less jealous, more hardworking or just generally better: to live up to your full potential. If the feeling was powerful enough, it might have changed your behaviour for a few minutes, days, weeks, months, years or a lifetime. Such change is the result of a mechanism I shall call ‘moral hydraulics’. 

The value of shame




 it doesn’t take an Albert Einstein to observe that, without the essential first step, without a creative reimagining of nature, a conceiving of hypotheses for what might be going on behind the perceived surface of phenomena, there can be no science at all. Einstein did of course have something to say on the matter. As he told an interviewer in 1929:

I am enough of an artist to draw freely upon my imagination. Imagination is more important than knowledge. Knowledge is limited. Imagination encircles the world.

Every scientist knows this, but for two centuries they have fallen mute about it, preferring instead a safer narrative about the ‘empirical method’ or ‘the logic of scientific discovery’. Science education favours the presentation of results, and a focus on knowledge, rather than the human stories of wonder, imagination, failed ideas and those glorious and uninvited moments of illumination that thread through the lives of all who actually do science. Our media mouths the same message – I will never forget the BBC documentary on computer science in which the presenter assured viewers, face to camera, that there is no room for imagination in science. No wonder my young colleagues had become disillusioned.

The historically contemporaneous birth of the English novel and of the experimental method in science turns out to be no coincidence. Without making the naive claim that art and science are in any sense ‘doing the same thing’, the narrative similarities in the experience of those who work with them are remarkable. They need digging out because they become obscured by scientists shy of talking about imagination and artists about experiment.

Science is deeply imaginative: why is this treated as a secret?




The economy is a network of complements and substitutes, which I will call the Economic Web. And like the biosphere, its evolution is substantially unprestatable, “context dependent,” and creates its own growing “context” that comprises its “Adjacent Possible.” The adjacent possible is what can arise next in this evolution. This evolution is “sucked into” the very opportunities it itself creates. Innovations into the Adjacent Possible drive this growth.

…. the evolution of the entire economic web, goods and services create novel niches which invite the innovative creation of new complementary and substitute goods such that the web as a whole grows in diversity.

The personal computer did not cause but enabled the innovation and invention of word processing, and software companies such as Microsoft, originally founded to make the operating system for IBM personal computers, emerged.
The invention of word processing and abundant files invited the possibility of file sharing and the modem was innovated and invented.
The existence of file sharing did not cause, but invited the innovation of the World Wide Web.
The existence of the Web did not cause, but enabled the innovation of selling on the Web and eBay and Amazon emerged.

 goods and services as contexts do not cause but enable the innovation, invention and introduction of the next good or service. “Enablement” is not a word used in physics. The current Actual context of goods and services invites innovation into the current adjacent possible, of the next new goods and services, typically the complements and substitutes of existing goods and services.

Innovation and The Evolution of the Economic Web




This is an important signal about the future of knowledge and information management, of science publication and more. We need a new paradigm for sharing and accessing the results of science research.
a sense of frustration permeates the scientific community: many feel that journals are a relic of the past, running an unaccountable peer-review system that has an outsized influence on the careers of researchers. Moreover, after years of above-inflation price increases and high profits, the economic sustainability of the subscription-based model of journals is increasingly being questioned.

Rise of the platforms

Journals are evolving into information platforms. This development provides a key to understanding recent trends in science publishing, and raises important questions about its future.
As much as it is a cliché to say that the volume of academic papers grows exponentially, it is nevertheless sobering to recall that this statement is, in fact, an empirical fact. A notion first popularized by the physicist turned scientometrician Derek de Solla Price, the inexorable expansion of the scientific literature has continued to accelerate into the twenty-first century. Indeed, the deluge of scholarly data now available digitally is such that the study of the practice of science itself, particularly the dynamics underpinning the process of scientific discovery and collaboration, has emerged as a quantitative discipline in its own right.

This explosion of information has resulted in a concomitant boom in the number of available journals and the financial success of their publishers. This is not how it was meant to be — at least not according to the early pioneers of the Internet. As Michael Clarke memorably put it nearly a decade ago: when Tim Berners-Lee created the Web in 1991, it was with the aim of facilitating scientific communication and the dissemination of scientific research. The extent to which the Web hasn’t disrupted the scientific publishing industry (while, at the same time, dramatically reshaping nearly all other retail services) is therefore surprising.

In physics, platforms such as quantum (https://quantum-journal.org/), researchers.one (https://www.researchers.one/) and, perhaps most impressively, SciPost (https://scipost.org/) are experimenting with innovative approaches such as transparent peer review, user comments and even doing away with the concept of accepting or rejecting papers entirely. Their editorial and business models also differ, but are predicated on the idea of being controlled by fellow scientists, open access and not for profit. 


This is an excellent podcast about the potentially emerging new economic paradigm - well worth the listen for anyone interested in an alternative to neoliberal economic pseudo-science :)

The MMT Podcast with Patricia Pino & Christian Reilly

The MMT Podcast offers economic analysis on current issues from a Modern Monetary Theory perspective. Aimed at anyone who has ever felt lost in the jargon used by mainstream economics commentators. We believe economics is for everyone.


This is another vital signal of the work that is helping the development of a new economic paradigm more suited to a democratic 21st Century. This is the abstract of a 20 page paper.

Tragedy of the Commons after 50 Years

Abstract
Garrett Hardin’s “The Tragedy of the Commons” (1968) has been incredibly influential generally and within economics, and it remains important despite some historical and conceptual flaws. Hardin focused on the stress population growth inevitably placed on environmental resources. Unconstrained consumption of a shared resource—a pasture, a highway, a server—by individuals acting in rational pursuit of their self-interest can lead to congestion and worse, rapid depreciation, depletion, and even destruction of the resources. Our societies face similar problems, not only with respect to environmental resources but also with infrastructures, knowledge, and many other shared resources. In this Retrospective, we examine how the tragedy of the commons has fared within the economics literature and its relevance for economic and public policies today. We revisit the original piece to explain Hardin’s purpose and conceptual approach. We expose two conceptual mistakes he made, that of conflating resource with governance and conflating open access with commons. This critical discussion leads us to the work of Elinor Ostrom, the recent Nobel Prize in Economics Laureate, who spent her life working on commons. Finally, we discuss a few modern examples of commons governance of shared resources.
Keywords: Garrett Hardin, Tragedy of the Commons, Commons, Common Pool Resources, Vincent Ostrom, Elinor Ostrom, Open Access, Infrastructures


Another signal in the demise of neoliberal economics, one that supports the 40+ years of empirical work by Elinor Ostrom. The deeply entrenched premise of the atomistic, isolated, selfish individual as the ubiquitous rational actor in all societies is increasingly being displaced by empirical science, anthropology and modeling.
a strategy that can lead to mutual cooperation without using non-cooperative actions, even when facing an exploiter. The strategy can be described as "escape interaction if a partner defected or cooperate if a partner escaped interaction."
Yamamoto says that cooperative society can be maintained without using the action of revenge if the action of escape is possible, and this may expand the research on the evolution of cooperation.

The Prisoner's Dilemma: Exploring a strategy that leads to mutual cooperation without non-cooperative actions

A research team led by Hitoshi Yamamoto from Rissho University has analyzed which strategies would be effective in the prisoner's dilemma game, into which a new behavior of non-participation in the game was introduced. The study was carried out in collaboration with colleagues Isamu Okada (Soka University), Takuya Taguchi (Shibaura Institute of Technology), and Masayoshi Muto (Shibaura Institute of Technology). The results of the study were published in Physical Review E.

Cooperation in mutual competition is a basic mechanism for the prosperity of human society. However, the simplest model of cooperation in game theory predicts that cooperation will not emerge among rational people because cooperative behaviors incur costs to cooperators, and free riding is a better option.

The team analyzed which strategy promotes and maintains a cooperative society in a basic model of a social dilemma called the Prisoner's Dilemma by introducing a new action of non-participation in games. While previous studies could only analyze simple combinations of strategies, the research team used agent simulations and developed a method for visualizing more complex simulation results, enabling them to analyze adaptive strategies in an environment where approximately 20,000 strategies coexist and compete with each other.
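The escape strategy described above can be sketched as a toy iterated game. This is an illustrative simulation only, not the model from the Physical Review E paper: the payoff values and the escape payoff `E_PAY` are arbitrary assumptions chosen to satisfy the usual prisoner's dilemma ordering.

```python
# Toy iterated prisoner's dilemma with a third action, "escape" (E),
# i.e. non-participation. Payoffs are assumed for illustration.
R, S, T, P = 3, 0, 5, 1   # mutual cooperation, sucker, temptation, mutual defection
E_PAY = 0.5               # assumed payoff to both players when either escapes

def payoff(a, b):
    """Payoff to the player choosing action a against a partner choosing b."""
    if 'E' in (a, b):
        return E_PAY
    return {('C', 'C'): R, ('C', 'D'): S, ('D', 'C'): T, ('D', 'D'): P}[(a, b)]

def escape_strategy(partner_last):
    """'Escape if the partner defected; cooperate if the partner escaped'
    (and cooperate on the first move)."""
    return 'E' if partner_last == 'D' else 'C'

def always_defect(partner_last):
    return 'D'

def play(strat_a, strat_b, rounds=10):
    """Run the iterated game; each strategy sees only the partner's last action."""
    a_last = b_last = None
    score_a = score_b = 0.0
    for _ in range(rounds):
        a, b = strat_a(b_last), strat_b(a_last)
        score_a += payoff(a, b)
        score_b += payoff(b, a)
        a_last, b_last = a, b
    return score_a, score_b

# Two escape strategists settle into mutual cooperation.
print(play(escape_strategy, escape_strategy, rounds=5))   # (15.0, 15.0)
# Against an exploiter, the strategy never retaliates with defection --
# it simply withdraws, capping the exploiter's gains after round one.
print(play(escape_strategy, always_defect, rounds=5))     # (2.0, 7.0)
```

Note how the strategy sustains cooperation without ever playing D itself, which is the point of the reported result; the paper's actual analysis covers roughly 20,000 coexisting strategies rather than this two-strategy sketch.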


We really do have to re-imagine how we architect our built environments - for walkable, healthy, social community. Collective resources and value creation are the heart of a viable evolving society. The visual is worth the view.
“This is a new way of seeing architecture,” says Matthias Kohler, a member of DFAB’s research team. The work of architects has long been presented in terms of designing inspiring building forms, while the technical specifics of construction has been relegated to the background. Kohler thinks this is quickly changing. “Suddenly how we use resources to build our habitats is at the center of architecture,” he argues. “How you build matters.”

A Swiss house built by robots promises to revolutionize the construction industry

Erecting a new building ranks among the most inefficient, polluting activities humans undertake. The construction sector is responsible for nearly 40% of the world’s total energy consumption and CO2 emissions, according to a UN global survey (pdf).

A consortium of Swiss researchers has one answer to the problem: working with robots. The proof of concept comes in the form of the DFAB House, celebrated as the first habitable building designed and planned using a choreography of digital fabrication methods.

The three-level building near Zurich features 3D-printed ceilings, energy-efficient walls, timber beams assembled by robots on site, and an intelligent home system. Developed by a team of experts at ETH Zurich university and 30 industry partners over the course of four years, the DFAB House, measuring 2,370 square feet (220 square meters), needed 60% less cement and has passed the stringent Swiss building safety codes.


This is a key signal - if it proves to be true - a phase transition in computational paradigm is near.

Google researchers have reportedly achieved “quantum supremacy”

According to a report in the Financial Times, a team of researchers from Google led by John Martinis have demonstrated quantum supremacy for the first time. This is the point at which a quantum computer is shown to be capable of performing a task that’s beyond the reach of even the most powerful conventional supercomputer. The claim appeared in a paper that was posted on a NASA website, but the publication was then taken down. Google did not respond to a request for comment from MIT Technology Review.

Why NASA? Google struck an agreement last year to use supercomputers available to NASA as benchmarks for its supremacy experiments. According to the Financial Times report, the paper said that Google’s quantum processor was able to perform a calculation in three minutes and 20 seconds that would take today’s most advanced supercomputer, known as Summit, around 10,000 years. In the paper, the researchers said that, to their knowledge, the experiment “marks the first computation that can only be performed on a quantum processor.”
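A back-of-envelope calculation, using only the figures quoted in the report, gives a sense of the claimed gap:

```python
# Speedup factor implied by the reported numbers: 3 min 20 s on the
# quantum processor vs ~10,000 years on the Summit supercomputer.
quantum_seconds = 3 * 60 + 20                   # 200 s
summit_seconds = 10_000 * 365.25 * 24 * 3600    # ~10,000 years in seconds

speedup = summit_seconds / quantum_seconds
print(f"claimed speedup ≈ {speedup:.2e}")       # on the order of 10^9
```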

 In a discussion of quantum computing at MIT Technology Review’s EmTech conference in Cambridge, Massachusetts, this week before news of Google’s paper came out, Will Oliver, an MIT professor and quantum specialist, likened the computing milestone to the first flight of the Wright brothers at Kitty Hawk in aviation. He said it would give added impetus to research in the field, which should help quantum machines achieve their promise more quickly. Their immense processing power could ultimately help researchers and companies discover new drugs and materials, create more efficient supply chains, and turbocharge AI.


This is a great signal of the evolving nature of our understanding of the complex trails that life and action leave in their wakes.

‘The Nature of Life and Death’ spotlights pollen’s role in solving crimes

Even if a criminal doesn’t leave behind fingerprints or DNA, detectives need not worry. Crime scenes are peppered with other clues — pollen and spores — that can trip up even the most careful crooks.

These clues are central to forensic ecology, in which scientists analyze biological material to help detectives solve crimes. In The Nature of Life and Death, botanist Patricia Wiltshire lays out the science underlying the discipline — which she helped pioneer in the United Kingdom — as she chronicles some of her most memorable cases of the last 20 or so years.

Early in her career, Wiltshire used the power of pollen and spores to analyze archaeological sites. The qualities that make these particles useful for studying the past also make them useful for solving crimes. The particles’ natural polymers can be long-lasting, and in certain conditions, pollen and spores persist longer than other forms of evidence, even for thousands of years.


The bio-economy promises to produce many old and new products and transform manufacturing.

Algae and bacteria team up to increase hydrogen production

In order to bring that future closer, a team from the Biochemistry and Molecular Biology Department at the University of Cordoba has been searching for ways to increase hydrogen production by using microorganisms, specifically microalgae and bacteria.

In this vein, researchers Neda Fakhimi, Alexandra Dubini and David González Ballester were able to increase hydrogen production by combining the unicellular green alga Chlamydomonas reinhardtii with Escherichia coli bacteria. Working together, the algae and bacteria produced 60% more hydrogen than they are able to produce separately.

The potential of the algae-bacteria combination has been proven and opens the door to industrial use, since the sugar added for bacterial fermentation in the lab could be replaced by waste streams in the real world. In other words, the relationship between algae and bacteria could use industrial waste and dirty water to produce hydrogen and decontaminate at the same time.

The combination of bioremediation (the use of microorganisms for decontamination) and hydrogen production for use as a biofuel brings sustainability full circle, in a society where sustainability is an ever more pressing concern.


The world of our genome continues to reveal its secrets - and what were once considered ‘junk genes’ become another source of evolution.
"Transposons carry huge potential for crop improvement. They are powerful drivers of trait diversity, and while we have been harnessing these traits to improve our crops for generations, we are now starting to understand the molecular mechanisms involved," 

Harnessing tomato jumping genes could help speed-breed drought-resistant crops

Researchers from the University of Cambridge's Sainsbury Laboratory (SLCU) and Department of Plant Sciences have discovered that drought stress triggers the activity of a family of jumping genes (Rider retrotransposons) previously known to contribute to fruit shape and colour in tomatoes. Their characterisation of Rider, published today in the journal PLOS Genetics, revealed that the Rider family is also present and potentially active in other crops, highlighting its potential as a source of new trait variations that could help plants better cope with more extreme conditions driven by our changing climate.

Transposons, more commonly called jumping genes, are mobile snippets of DNA code that can copy themselves into new positions within the genome—the genetic code of an organism. They can change, disrupt or amplify genes, or have no effect at all. Discovered in corn kernels by Nobel prize-winning scientist Barbara McClintock in the 1940s, only now are scientists realising that transposons are not junk at all but actually play an important role in the evolutionary process, and in altering gene expression and the physical characteristics of plants.

Using the jumping genes already present in plants to generate new characteristics would be a significant leap forward from traditional breeding techniques, making it possible to rapidly generate new traits in crops that have traditionally been bred to produce uniform shapes, colours and sizes to make harvesting more efficient and maximise yield. They would enable production of an enormous diversity of new traits, which could then be refined and optimised by gene targeting technologies.


Another interesting signal in the possibilities of domesticating biology.

Suntanner, heal thyself: Exosome therapy may enable better repair of sun, age-damaged skin

In the future, you could be your very own fountain of youth—or at least your own skin repair reservoir. In a proof-of-concept study, researchers from North Carolina State University have shown that exosomes harvested from human skin cells are more effective at repairing sun-damaged skin cells in mice than popular retinol or stem cell-based treatments currently in use. Additionally, the nanometer-sized exosomes can be delivered to the target cells via needle-free injections.

Exosomes are tiny sacs (30—150 nanometers across) that are excreted and taken up by cells. They can transfer DNA, RNA or proteins from cell to cell, affecting the function of the recipient cell. In the regenerative medicine field, exosomes are being tested as carriers of stem cell-based treatments for diseases ranging from heart disease to respiratory disorders.


This is a fascinating signal of a whole new approach to antibiotic resistance and perhaps other forms of resistance.

Genetically engineered plasmid can be used to fight antimicrobial resistance

Researchers have engineered a plasmid to remove an antibiotic resistance gene from the Enterococcus faecalis bacterium, an accomplishment that could lead to new methods for combating antibiotic resistance. The research is published this week in Antimicrobial Agents and Chemotherapy, a journal of the American Society for Microbiology.

In vitro and in mouse models, the engineered plasmid removed the antibiotic resistance gene from E. faecalis. In mouse models, it reduced the abundance of the resistance gene threefold.

The delivery vehicle for the engineered plasmid is a particular strain of E. faecalis, which conjugates with E. faecalis of various different strains. Conjugation is the process whereby bacteria come together to transfer genetic material from one to the other via direct cell to cell contact.


Some good news signaling hope that human caused challenges can be addressed.

2019 ozone hole could be smallest in three decades

The ozone hole over Antarctica this year could be one of the smallest seen in three decades, say scientists.
Observations of the gas's depletion high in the atmosphere demonstrate that the hole hasn't opened up in 2019 in the way it normally does.

The EU's Copernicus Atmosphere Monitoring Service (CAMS) says it's currently well under half the area usually seen in mid-September.
The hole is also off-centre and far from the pole, the EU agency adds.

CAMS' experts, who are based in Reading, UK, are projecting stable levels of ozone or a modest increase in the coming days.


This signals the potential for good news in carbon reductions within the next decade.

Lower carbon dioxide emissions on the horizon for cement

Concrete is the most widely used building material in the world. As a key component of concrete, cement—and more specifically its production process—is a significant contributor to climate change. Every year, over 4 billion tonnes of cement are produced, and this activity is responsible for around 8 percent of global CO2 emissions. Surprisingly, around 60 percent of the CO2 emissions associated with this production aren't released from the combustion of fossil fuels, but from the chemical reaction in the process. With this challenging problem in mind, the EU-funded LEILAC project developed a breakthrough technology with the potential to dramatically reduce the emissions of Europe's cement and lime industries.
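A quick calculation with the figures quoted above shows why LEILAC targets the calciner: even a fully decarbonized fuel supply would leave the chemical-process emissions untouched, and those are the larger share.

```python
# Rough split of cement's CO2 footprint, using only the figures above.
cement_share_of_global_co2 = 0.08   # cement ≈ 8% of global CO2 emissions
process_fraction = 0.60             # share released by the calcination reaction

process_share = cement_share_of_global_co2 * process_fraction
fuel_share = cement_share_of_global_co2 * (1 - process_fraction)

print(f"process (calcination) emissions ≈ {process_share:.1%} of global CO2")
print(f"fuel-combustion emissions ≈ {fuel_share:.1%} of global CO2")
```

So roughly 4.8% of global CO2 comes from the reaction itself, which only capture at the calciner (not cleaner fuels) can address.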

By re-engineering the standard process flows of a traditional calciner, the LEILAC system enables pure CO2 to be captured as it's released from the limestone while the furnace combustion gases are kept separate. The innovation requires no additional chemicals and only minimal changes to the standard cement production processes, namely the replacement of the calciner.

Coordinated by Calix Limited, the project recently successfully completed its preliminary test runs at a cement plant at Lixhe, in eastern Belgium. Using the new technology on both cement and lime meal, the LEILAC team successfully demonstrated the separation of CO2 with more than 95 percent purity.

Although not yet at full design capacity, the technology shows promise as a way for the cement industry to decrease its emissions without significant energy or capital penalty. If the industry is to aid the EU in meeting its target of reducing CO2 emissions by at least 80 percent by 2050, then it will have to implement carbon capture in about two thirds of its plants.


Another strong signal of the transformation of energy geopolitics.

Australia’s capital city switches to 100% renewable energy

Canberra will be the first major region in the Southern Hemisphere to purchase all its energy from renewable sources.
The Australian capital, Canberra, will become the first city outside Europe to shift from fossil fuel to 100% renewable energy.

From 1 January 2020, Canberra will join seven other districts around the world that produce or purchase the equivalent of their total electricity consumption from renewable sources, according to a report released on 18 September by policy think tank the Australia Institute in Canberra.

The report analysed data on more than 500 regions around the world with populations greater than 100,000 people.
The district of Rhein-Hunsrück in Germany became the first area to go 100% renewable, in 2012; two German states, three states in Austria and one region in Spain followed.


Well, they say the brain is a major consumer of energy.

The grandmaster diet: How to lose weight while barely moving

The 1984 World Chess Championship was called off after five months and 48 games because defending champion Anatoly Karpov had lost 22 pounds. "He looked like death," grandmaster and commentator Maurice Ashley recalls.

In 2004, winner Rustam Kasimdzhanov walked away from the six-game world championship having lost 17 pounds. In October 2018, Polar, a U.S.-based company that tracks heart rates, monitored chess players during a tournament and found that 21-year-old Russian grandmaster Mikhail Antipov had burned 560 calories in two hours of sitting and playing chess -- or roughly what Roger Federer would burn in an hour of singles tennis.

Robert Sapolsky, who studies stress in primates at Stanford University, says a chess player can burn up to 6,000 calories a day while playing in a tournament, three times what an average person consumes in a day. Based on breathing rates (which triple during competition), blood pressure (which elevates) and muscle contractions before, during and after major tournaments, Sapolsky suggests that grandmasters' stress responses to chess are on par with what elite athletes experience.


The progress in mind-computer interface continues to improve in all sorts of ways.

Brain-computer interfaces without the mess

It sounds like science fiction: controlling electronic devices with brain waves. But researchers have developed a new type of electroencephalogram (EEG) electrode that can do just that, without the sticky gel required for conventional electrodes. Even better, the devices work through a full head of hair. The researchers report the flexible electrodes, which could someday be used in brain-computer interfaces to drive cars or move artificial limbs, in the ACS journal Nano Letters.

Often used to diagnose seizure disorders and other neurological conditions, EEGs are machines that track and record brain wave patterns. To conduct an EEG, technicians typically use a very sticky gel to attach electrodes to different regions of the patient's scalp. However, this gel is difficult to wash out of hair and sometimes irritates the skin. In addition, hair interferes with the electrical signals. Ming Lei, Bo Hong, Hui Wu and colleagues wanted to develop an EEG electrode that is flexible, robust and gel-free. Such an electrode could help patients, but also might allow people to someday control devices with their brains.