Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.) that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.
Many thanks to those who enjoy this. ☺
In the 21st Century curiosity will SKILL the cat.
Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st Century
“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9
Content
Quotes:
Articles:
Researchers once knew little about the effects of the Little Ice Age outside of Europe, but no longer. It now seems, for example, that the frigid decades of the 15th century brought unseasonal frost across Mesoamerica, repeatedly ruining maize harvests in the Aztec empire. Food shortages provoked famine and, if surviving accounts can be believed, even cannibalism, weakening the empire just before the arrival of European ships and soldiers.
Across Europe and North America, the 17th century was the coldest of the Little Ice Age. Researchers have argued that, by then, the world’s great empires had grown vulnerable to even the slightest shift in environmental conditions. Populations that expanded in the warmer decades of the 16th century increasingly depended on crops grown on marginally productive farmland. Imperial governments financed ever-more expensive wars using surpluses siphoned from far-flung hinterlands. With rural areas already stretched to breaking point, temperature and precipitation extremes provoked calamitous food shortages. Famines led to widespread starvation, migration and epidemics, which in turn kindled rebellions, civil wars and conflict between states. According to the historian Geoffrey Parker, this ‘fatal synergy’ between climatic cooling, starvation, disease and conflict culminated in a ‘global crisis’ that killed perhaps a third of the world’s population.
Yet a new wave of research is revealing previously overlooked examples of creativity and adaptability even in communities that suffered most as Earth’s climate changed.
Climate change did pose severe challenges for the Dutch and, when it did, the Dutch often adapted creatively. When storms sparked a series of urban fires across Europe, for example, Dutch inventors developed and then exported new firefighting technologies and practices. When winter ice choked harbours and halted traffic on essential canals, the Dutch invented skates and refined icebreakers. Merchants set up fairs on the ice that attracted thousands from afar, and pioneered insurance policies that protected them from the risks of storms at sea.
Both Europe and the Americas now seem like hotbeds of resilience and adaptation to climate change
Different communities and even individuals within societies experienced climate change very differently, and there does not seem to have been a common, dismal fate shared by all who faced the coldest centuries of the Little Ice Age.
Most attempts to estimate the economic or geopolitical impacts of future warming therefore involve little more than educated guesswork. The future is hard to predict – perhaps harder than it ever was – and both collapse and prosperity seem possible in the century to come. So let us approach the future with open minds. Rather than resign ourselves to disaster, let us work hard to implement radical policies – such as the Green New Deal – that go beyond simply preserving what we have now, and instead promise a genuinely better world for our children.
Little Ice Age lessons
In other respects, though, economic life in parts of early colonial America resembled tribal sustenance more than brutal commercial exchange. In seventeenth-century Connecticut, for example, economic and legal relationships among free citizens were informal, governed by “book debts”—tabulated records of who owed what—as well as by what the legal historian Bruce Mann (incidentally, Elizabeth Warren’s husband) calls a “communal model” of dispute resolution. As Mann describes it in his study Neighbors and Strangers (1987), trust remained at the core of the relationship between creditors and debtors. Book debts didn’t signify formal promises to pay, but rather an understanding that payments would be made as the debtor became able. Credit was extended through personal connection, and disputes were negotiated through appeals to personal character.
Communalism does not preclude conflict, of course; disputes were common. But only after informal, interpersonal dealings had deteriorated would matters come before the courts. There, character witnesses would testify, and recorded book debts would serve only as a starting point for discussion about who owed and how much. Still, the number of debt cases heard by local courts, Mann argues, points to the importance of credit in a young, cash-poor society. As Mann went on to explore in Republic of Debtors (2002), the same was as true for the relatively wealthy as for subsistence agrarians. In the land of self-made merchants, debt was a way to get money flowing, especially when the money itself—gold and silver—was thin on the ground. At first this debt was governed by its own kind of communalism—what Graeber calls the “communism of the rich.” Among the Southern plantation and Northern merchant classes, it was gauche to demand repayment of debts—a rupture of gentlemanly agreements. Equilibrium meant most debts were eventually repaid.
The Long History of Debt Cancellation
Mainstream economists nowadays might not be particularly good at predicting financial crashes, facilitating general prosperity, or coming up with models for preventing climate change, but when it comes to establishing themselves in positions of intellectual authority, unaffected by such failings, their success is unparalleled. One would have to look at the history of religions to find anything like it. To this day, economics continues to be taught not as a story of arguments—not, like any other social science, as a welter of often warring theoretical perspectives—but rather as something more like physics, the gradual realization of universal, unimpeachable mathematical truths. “Heterodox” theories of economics do, of course, exist (institutionalist, Marxist, feminist, “Austrian,” post-Keynesian…), but their exponents have been almost completely locked out of what are considered “serious” departments, and even outright rebellions by economics students (from the post-autistic economics movement in France to post-crash economics in Britain) have largely failed to force them into the core curriculum.
Nowhere is this divide between public debate and economic reality more dramatic than in Britain, which is perhaps why it appears to be the first country where something is beginning to crack. It was center-left New Labour that presided over the pre-crash bubble, and voters’ throw-the-bastards-out reaction brought a series of Conservative governments that soon discovered that a rhetoric of austerity—the Churchillian evocation of common sacrifice for the public good—played well with the British public, allowing them to win broad popular acceptance for policies designed to pare down what little remained of the British welfare state and redistribute resources upward, toward the rich. “There is no magic money tree,” as Theresa May put it during the snap election of 2017—virtually the only memorable line from one of the most lackluster campaigns in British history. The phrase has been repeated endlessly in the media, whenever someone asks why the UK is the only country in Western Europe that charges university tuition, or whether it is really necessary to have quite so many people sleeping on the streets.
The truly extraordinary thing about May’s phrase is that it isn’t true. There are plenty of magic money trees in Britain, as there are in any developed economy. They are called “banks.” Since modern money is simply credit, banks can and do create money literally out of nothing, simply by making loans. Almost all of the money circulating in Britain at the moment is bank-created in this way. Not only is the public largely unaware of this, but a recent survey by the British research group Positive Money discovered that an astounding 85 percent of members of Parliament had no idea where money really came from (most appeared to be under the impression that it was produced by the Royal Mint).
One sign that something historically new has indeed appeared is if scholars begin reading the past in a new light. Accordingly, one of the most significant books to come out of the UK in recent years would have to be Robert Skidelsky’s Money and Government: The Past and Future of Economics.
There is a growing feeling, among those who have the responsibility of managing large economies, that the discipline of economics is no longer fit for purpose. It is beginning to look like a science designed to solve problems that no longer exist.
A good example is the obsession with inflation. Economists still teach their students that the primary economic role of government—many would insist, its only really proper economic role—is to guarantee price stability. We must be constantly vigilant over the dangers of inflation. For governments to simply print money is therefore inherently sinful. If, however, inflation is kept at bay through the coordinated action of government and central bankers, the market should find its “natural rate of unemployment,” and investors, taking advantage of clear price signals, should be able to ensure healthy growth. These assumptions came with the monetarism of the 1980s, the idea that government should restrict itself to managing the money supply, and by the 1990s had come to be accepted as such elementary common sense that pretty much all political debate had to set out from a ritual acknowledgment of the perils of government spending. This continues to be the case, despite the fact that, since the 2008 recession, central banks have been printing money frantically in an attempt to create inflation and compel the rich to do something useful with their money, and have been largely unsuccessful in both endeavors.
We now live in a different economic universe than we did before the crash. Falling unemployment no longer drives up wages. Printing money does not cause inflation. Yet the language of public debate, and the wisdom conveyed in economic textbooks, remain almost entirely unchanged.
The crux of the argument always seems to turn on the nature of money. Is money best conceived of as a physical commodity, a precious substance used to facilitate exchange, or is it better to see money primarily as a credit, a bookkeeping method or circulating IOU?
The one major exception to this pattern was the mid-twentieth century, what has come to be remembered as the Keynesian age. It was a period in which those running capitalist democracies, spooked by the Russian Revolution and the prospect of the mass rebellion of their own working classes, allowed unprecedented levels of redistribution—which, in turn, led to the most generalized material prosperity in human history. The story of the Keynesian revolution of the 1930s, and the neoclassical counterrevolution of the 1970s, has been told innumerable times, but Skidelsky gives the reader a fresh sense of the underlying conflict.
Economic theory as it exists increasingly resembles a shed full of broken tools. This is not to say there are no useful insights here, but fundamentally the existing discipline is designed to solve another century’s problems. The problem of how to determine the optimal distribution of work and resources to create high levels of economic growth is simply not the same problem we are now facing: i.e., how to deal with increasing technological productivity, decreasing real demand for labor, and the effective management of care work, without also destroying the Earth. This demands a different science. The “microfoundations” of current economics are precisely what is standing in the way of this. Any new, viable science will either have to draw on the accumulated knowledge of feminism, behavioral economics, psychology, and even anthropology to come up with theories based on how people actually behave, or once again embrace the notion of emergent levels of complexity—or, most likely, both.
Intellectually, this won’t be easy. Politically, it will be even more difficult. Breaking through neoclassical economics’ lock on major institutions, and its near-theological hold over the media—not to mention all the subtle ways it has come to define our conceptions of human motivations and the horizons of human possibility—is a daunting prospect. Presumably, some kind of shock would be required. What might it take? Another 2008-style collapse? Some radical political shift in a major world government? A global youth rebellion? However it will come about, books like this—and quite possibly this book—will play a crucial part.
David Graeber - Against Economics
Every minute in 2018, Google conducted 3.88 million searches, and people watched 4.33 million videos on YouTube, sent 159,362,760 e-mails, tweeted 473,000 times and posted 49,000 photos on Instagram, according to software company Domo. By 2020 an estimated 1.7 megabytes of data will be created per second per person globally, which translates to about 418 zettabytes in a single year (418 billion one-terabyte hard drives' worth of information), assuming a world population of 7.8 billion. The magnetic or optical data-storage systems that currently hold this volume of 0s and 1s typically cannot last for more than a century, if that. Further, running data centers takes huge amounts of energy. In short, we are about to have a serious data-storage problem that will only become more severe over time.
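As a rough sanity check on those figures (a back-of-the-envelope sketch using the article's own estimates of 1.7 MB per second per person and a population of 7.8 billion, not independent data):

```python
# Back-of-the-envelope check of the projected 2020 data volume.
MB = 10**6                       # bytes per megabyte
TB = 10**12                      # bytes per terabyte
ZB = 10**21                      # bytes per zettabyte

rate_per_person = 1.7 * MB       # bytes created per second, per person (article's estimate)
population = 7.8e9               # assumed world population (article's figure)
seconds_per_year = 365 * 24 * 3600

total_bytes = rate_per_person * population * seconds_per_year
print(f"{total_bytes / ZB:.0f} zettabytes per year")        # ~418
print(f"{total_bytes / TB / 1e9:.0f} billion 1 TB drives")  # ~418
```

The arithmetic does come out to roughly 418 zettabytes a year, which is indeed about 418 billion one-terabyte drives' worth.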
Top 10 Emerging Technologies Of 2019
DNA DATA STORAGE IS CLOSER THAN YOU THINK
This is something that we should all support for the 21st Century citizenry - it is about time citizens were no longer held hostage to privateering of digital public infrastructure.
‘Broadband communism’? Outside the UK, public broadband is a raving success
Around the world, governments are taking the lead in developing digital infrastructure. Nowhere is this more evident than in that well-known communist enclave: the United States of America.
This week, the Labour Party announced a bold new policy proposal that has shaken up the election race – publicly owned broadband internet, free to all. In the words of Jeremy Corbyn, the party’s leader, it is “a taster of the kind of fresh, transformational policies that will change your life.”
Under the plan, the government would purchase Openreach, the digital network operator that is a subsidiary of BT Group, and form a new publicly owned British Broadband company to extend high-speed internet access to every household, business, and institution in the country. As Mat Lawrence, Director of Common Wealth, revealed in the Guardian, given that just around 7% of premises in the UK currently have such access (compared to nearly 100% in countries like Japan and South Korea), expanding broadband access would have considerable economic, social, and environmental benefits. Moreover, making the service free would reduce household and commercial bills, putting more money back in the pockets of families and entrepreneurs.
As expected, corporate lobby groups and their political lackeys were less than thrilled at the announcement. For instance, Boris Johnson, in his usual histrionic manner, derided it as a “crazed communist scheme.” The BBC was happy to regurgitate the attack line of BT’s Managing Director, Neil McRae, who derided the proposals as “broadband communism”.
However, in reality, governments around the world are taking the lead on developing the digital infrastructure necessary to develop thriving 21st century economies (just as they did with the electricity networks, roads, bridges, railroads, airports, and other vital economic infrastructure of the 20th century). They are doing so because in many cases the private sector, and specifically a shrinking group of giant for-profit telecommunications corporations, is unable or unwilling to equitably provide the necessary investment and service, leaving whole towns, regions, and socio-economic groups shut out of the modern economy and society.
This is a worthwhile signal to pay attention to.
Video surveillance footage shows how rare violence really is
Today, videos from closed-circuit television, body cameras, police dash cameras, or mobile phones are increasingly used in the social sciences. For lack of other data, researchers previously relied on people’s often vague, partial, and biased recollections to understand how violence happened. Now, video footage shows researchers second-to-second how an event unfolded, who did what, who stood where, who communicated with whom, and which emotions they displayed, before violence broke out or a criminal event occurred. And while we would assume such footage highlights the cruel, brutal, savage nature of humanity, looking at violence up-close actually shows the opposite. Examining footage of violent situations – from the very cameras set up because we believe that violence lurks around every corner – suggests violence is rare and commonly occurs due to confusion and helplessness, rather than anger and hate.
This is a strong signal of the inevitable emergence of national genetic census - not only for the populations of our ecologies - but of our human populations as well.
Every butterfly in the United States and Canada now has a genome sequence
Draft genomes of more than 800 species hint at the role of interbreeding in the animal’s evolution.
When evolutionary biologist Nick Grishin wanted to tackle big questions in evolution — why some branches of the tree of life are so diverse, for instance — his team set out to sequence the genomes of as many butterflies as it could: 845 of them, to be precise.
In a study that some researchers are hailing as a landmark in genomics, Grishin’s group at the University of Texas Southwestern Medical Center in Dallas sequenced and analysed the genome of what it called a “complete butterfly continent”: every species of the creature in the United States and Canada. The study was posted on the bioRxiv preprint server on 4 November.
“I think it’s bloody amazing, because the technology involved in sequencing 845 species is there,” says James Mallet, an evolutionary biologist at Harvard University in Cambridge, Massachusetts. “It’s a beautiful piece of work, a tour de force, to do all that.”
A small signal with potentially huge impact - We may not only be domesticating DNA but at the beginning of domesticating the functions of DNA.
"It is truly exciting to consider the potential for alternate genetic systems ... that these might possibly have emerged and evolved in different environments, perhaps even on other planets or moons within our solar system," said co-author Jay Goodwin, a chemist at Emory University.
DNA Just One of More Than 1 Million Possible 'Genetic Molecules,' Scientists Find
Scientists used a computer program to uncover more than 1 million molecules that could potentially store genetic information, just like DNA.
DNA and its cousin RNA store genetic information and enable life as we know it — but what if millions of lesser-known chemicals could do the exact same thing?
A new study suggests that more than 1 million chemical look-alikes could encode biological information in the same way that DNA does. The new study, published Sept. 9 in the Journal of Chemical Information and Modeling, might point the way to new targets for pharmaceutical drugs, explain how life first evolved on Earth and even help us search for life-forms beyond our planet, the authors wrote.
Both DNA and RNA, the two known types of nucleic acids, contain chemical bits called nucleotides, which link up in a particular order and relay different data, depending on their sequence, similar to individual letters within a written sentence. Some natural and man-made molecules mimic the basic structure of DNA, but before now, no one had attempted to count up how many of these look-alikes might exist, the authors wrote.
The transformation of the bio-economy as a result of AI and the domestication of DNA continues to expand the horizons of the possible.
AI for plant breeding in an ever-changing climate
How might artificial intelligence (AI) impact agriculture, the food industry, and the field of bioengineering? Dan Jacobson, a research and development staff member in the Biosciences Division at the US Department of Energy's (DOE's) Oak Ridge National Laboratory (ORNL), has a few ideas.
For the past 5 years, Jacobson and his team have studied plants to understand the genetic variables and patterns that make them adaptable to changing environments and climates. As a computational biologist, Jacobson uses some of the world's most powerful supercomputers for his work—including the recently decommissioned Cray XK7 Titan and the world's most powerful and smartest supercomputer for open science, the IBM AC922 Summit supercomputer, both located at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility at ORNL.
Last year, Jacobson and his team won an Association for Computing Machinery Gordon Bell Prize after using a special computing technique known as "mixed precision" on Summit to become the first group to reach exascale speed—approximately a quintillion calculations per second.
Jacobson's team is currently working on numerous projects that form an integrated roadmap for the future of AI in plant breeding and bioenergy. The team's work was featured in Trends in Biotechnology in October.
In this Q&A, Jacobson talks about his team's work on a genomic selection algorithm, his vision for the future of environmental genomics, and the space where simulation meets AI.
A good signal of a possible new approach to capturing solar energy.
New hybrid device can both capture and store solar energy
Researchers from the University of Houston have reported a new device that can both efficiently capture solar energy and store it until it is needed, offering promise for applications ranging from power generation to distillation and desalination.
Unlike solar panels and solar cells, which rely on photovoltaic technology for the direct generation of electricity, the hybrid device captures heat from the sun and stores it as thermal energy. It addresses some of the issues that have stalled wider-scale adoption of solar power, suggesting an avenue for using solar energy around-the-clock, despite limited sunlight hours, cloudy days and other constraints.
The work, described in a paper published Wednesday in Joule, combines molecular energy storage and latent heat storage to produce an integrated harvesting and storage device for potential 24/7 operation. The researchers report a harvesting efficiency of 73% at small-scale operation and as high as 90% at large-scale operation.
Up to 80% of stored energy was recovered at night, and the researchers said daytime recovery was even higher.
Another small signal for the eventual spread of the Internet across the globe - and maybe even further.
Google is going to deploy Loon balloons in rural Peru
Google’s project Loon announced Wednesday it had signed a new deal to use its high-altitude balloons to connect rural communities in Peru to the internet. After several tests and limited-run operations following disasters, this will be the first time Loon will attempt to deliver permanent internet access.
Under the agreement, Loon will deploy its balloons to provide 4G/LTE service beginning in 2020, focusing on the country’s Loreto region, part of the Amazon rain forest. The balloons will attempt to cover 15% of Loreto and connect up to 200,000 people, many of whom are indigenous.
A good signal towards the transformation of transportation.
The flight is a big step forward to making routine commercial drone flights a reality, said Alexander Harmsen, CEO of Iris Automation. Harmsen thinks we could see commercial drone flights using this approach within a matter of months.
Drone company Iris Automation makes first-of-its-kind FAA-approved ‘blind’ drone flight
The first FAA-approved long-distance drone flight using no ground-based radar or visual observer took place recently in Kansas.
The flight is a step toward making commercial drone flights routine across the U.S.
Iris Automation, which created the onboard drone collision-avoidance system, said routine flights may happen in a matter of months.
I think this is a very LAME signal, because the information already exists to let the airline system know when a connecting flight will be missed and to assemble a list of options to present to passengers while they are in flight. Imagine: your plane is late, the connecting flight WILL be missed, and the flight attendant or airline app offers you a list of options for alternative flights, buses, and sleepovers. All you have to do is choose, and you get a new itinerary.
Google Flights aims to save air travelers money with new alerts on nearby airports, travel dates
Google Flights is trying to make the process of booking a flight a little easier—and less stressful—with some new updates.
Google rolled out a new feature in November for all users that notifies them when cheaper flights are available at nearby airports, Craig Ewer, a spokesperson for Google, confirmed to USA TODAY. Google also now offers a similar feature that notifies you if altering your travel dates could mean saving a significant chunk of change.
"We want people to trust that Google Flights helps you find the best flights that best fit your needs," Thijs van As, product lead for the Google Flights team, told USA TODAY. For example, if you're flying from New York to Washington and there is a flight available that's half the price or cheaper than what you're looking at, the site will inform you about changes to your itinerary.
Travelers don't buy tickets from Google (though they can book on the platform). The portal partners with airlines and travel agents to collect flight information. Effectively, it's a one-stop shop with insights and purchasing convenience.
This is a signal that could also transform travel and relationships including an emerging world of mixed reality entanglement.
A sensor-packed “skin” could let you cuddle your child in virtual reality
A second skin: A soft “skin” made from silicone could let the wearer feel objects in virtual reality. In a paper in Nature today, researchers from Northwestern University in the US and the Hong Kong Polytechnic University describe creating a multilayered material incorporating a chip, sensors, and actuators that let the wearer feel mechanical vibrations through the skin.
For now, the feeling of touch is transmitted by tapping areas on a touch screen, with the corresponding areas on the skin vibrating in real time in response. The skin is wireless and doesn’t require any batteries, thanks to inductive charging (the same technology used to charge smartphones wirelessly). Although virtual reality can recreate sounds and sights, the sense of touch has been underexplored, the researchers say.
The idea is that one day, these sorts of skins could let people communicate physically from a distance. For example, a parent could “hold” a child while on a virtual-reality video call from abroad. More immediately, it could also let VR video gamers feel strikes when playing, or help give users of prosthetic arms a better sense for the shape of objects they are holding. There has been plenty of hand-wringing in the tech industry over virtual reality’s failure to fulfill its full potential. Perhaps adding a sense of touch could help.
Here is a longer, more detailed article with a couple of short videos.
'Epidermal VR' gives technology a human touch
Imagine holding hands with a loved one on the other side of the world. Or feeling a pat on the back from a teammate in the online game "Fortnite."
This is a great signal of a future of the co-built environment - a Canadian icon colonizing the world :)
Beavers brought in to beat flooding in Britain
Beavers are to be reintroduced in two parts of Britain as part of plans to help control flooding, the National Trust announced on Wednesday.
The charity, which manages historic properties and countryside, said it aimed to release Eurasian beavers at two sites in southern England early next year.
"The dams the beavers create will hold water in dry periods, help to lessen flash-flooding downstream and reduce erosion and improve water quality by holding silt," said Ben Eardley, project manager at one of the sites.
Beavers are known as "nature's engineers", whose work can help create wetland habitats to support a range of species from insects to wildfowl. Beavers in river catchment areas would "help make our landscape more resilient to climate change and the extremes of weather it will bring", Eardley said.
This is a strong signal for the need to rethink how we re-imagine and create community, with appropriate new forms of housing and institutional innovations to care for each other.
Looming crisis for older family carers
Increasing numbers of older family carers, some in their 80s and 90s, are providing care to support adult children with severe learning disabilities or autism, according to a significant new report published today (Wednesday 20 November 2019).
The report, "Confronting a looming crisis," from Professor Rachel Forrester-Jones at the University of Bath for New Forest Mencap, highlights severe strains placed on families as a result of our aging population. It suggests much greater support is needed to help them cope.
According to the National Institute for Health and Care Excellence (NICE), two-thirds of adults with learning disabilities and/or autism live with their families—mainly with parents. Yet, whereas in the late 1940s, life expectancy for people with conditions such as Down's syndrome was just 12 years, today, thanks to medical advances, it has increased to 66—a rise of 450 percent.
Whilst this is to be celebrated, and means that people with learning disabilities can enter the retirement phase of the life cycle, it has also created a situation in which many adult carers, some in their 80s and 90s, are now caring for adult children, themselves in their 50s and 60s.