Thursday, November 28, 2019

Friday Thinking 29 Nov 2019

Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.)  that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st  Century

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9


Content
Quotes:

Articles:



Researchers once knew little about the effects of the Little Ice Age outside of Europe, but no longer. It now seems, for example, that the frigid decades of the 15th century brought unseasonal frost across Mesoamerica, repeatedly ruining maize harvests in the Aztec empire. Food shortages provoked famine and, if surviving accounts can be believed, even cannibalism, weakening the empire just before the arrival of European ships and soldiers.

Across Europe and North America, the 17th century was the coldest of the Little Ice Age. Researchers have argued that, by then, the world’s great empires had grown vulnerable to even the slightest shift in environmental conditions. Populations that expanded in the warmer decades of the 16th century increasingly depended on crops grown on marginally productive farmland. Imperial governments financed ever-more expensive wars using surpluses siphoned from far-flung hinterlands. With rural areas already stretched to breaking point, temperature and precipitation extremes provoked calamitous food shortages. Famines led to widespread starvation, migration and epidemics, which in turn kindled rebellions, civil wars and conflict between states. According to the historian Geoffrey Parker, this ‘fatal synergy’ between climatic cooling, starvation, disease and conflict culminated in a ‘global crisis’ that killed perhaps a third of the world’s population.

Yet a new wave of research is revealing previously overlooked examples of creativity and adaptability even in communities that suffered most as Earth’s climate changed. 

Climate change did pose severe challenges for the Dutch and, when it did, the Dutch often adapted creatively. When storms sparked a series of urban fires across Europe, for example, Dutch inventors developed and then exported new firefighting technologies and practices. When winter ice choked harbours and halted traffic on essential canals, the Dutch invented skates and refined icebreakers. Merchants set up fairs on the ice that attracted thousands from afar, and pioneered insurance policies that protected them from the risks of storms at sea.

Both Europe and the Americas now seem like hotbeds of resilience and adaptation to climate change

Different communities and even individuals within societies experienced climate change very differently, and there does not seem to have been a common, dismal fate shared by all who faced the coldest centuries of the Little Ice Age.

Most attempts to estimate the economic or geopolitical impacts of future warming therefore involve little more than educated guesswork. The future is hard to predict – perhaps harder than it ever was – and both collapse and prosperity seem possible in the century to come. So let us approach the future with open minds. Rather than resign ourselves to disaster, let us work hard to implement radical policies – such as the Green New Deal – that go beyond simply preserving what we have now, and instead promise a genuinely better world for our children.

Little Ice Age lessons




In other respects, though, economic life in parts of early colonial America resembled tribal sustenance more than brutal commercial exchange. In seventeenth-century Connecticut, for example, economic and legal relationships among free citizens were informal, governed by “book debts”—tabulated records of who owed what—as well as by what the legal historian Bruce Mann (incidentally, Elizabeth Warren’s husband) calls a “communal model” of dispute resolution. As Mann describes it in his study Neighbors and Strangers (1987), trust remained at the core of the relationship between creditors and debtors. Book debts didn’t signify formal promises to pay, but rather an understanding that payments would be made as the debtor became able. Credit was extended through personal connection, and disputes were negotiated through appeals to personal character.

Communalism does not preclude conflict, of course; disputes were common. But only after informal, interpersonal dealings had deteriorated would matters come before the courts. There, character witnesses would testify, and recorded book debts would serve only as a starting point for discussion about who owed and how much. Still, the number of debt cases heard by local courts, Mann argues, points to the importance of credit in a young, cash-poor society. As Mann went on to explore in Republic of Debtors (2002), the same was as true for the relatively wealthy as for subsistence agrarians. In the land of self-made merchants, debt was a way to get money flowing, especially when the money itself—gold and silver—was thin on the ground. At first this debt was governed by its own kind of communalism—what Graeber calls the “communism of the rich.” Among the Southern plantation and Northern merchant classes, it was gauche to demand repayment of debts—a rupture of gentlemanly agreements. Equilibrium meant most debts were eventually repaid.

The Long History of Debt Cancellation




Mainstream economists nowadays might not be particularly good at predicting financial crashes, facilitating general prosperity, or coming up with models for preventing climate change, but when it comes to establishing themselves in positions of intellectual authority, unaffected by such failings, their success is unparalleled. One would have to look at the history of religions to find anything like it. To this day, economics continues to be taught not as a story of arguments—not, like any other social science, as a welter of often warring theoretical perspectives—but rather as something more like physics, the gradual realization of universal, unimpeachable mathematical truths. “Heterodox” theories of economics do, of course, exist (institutionalist, Marxist, feminist, “Austrian,” post-Keynesian…), but their exponents have been almost completely locked out of what are considered “serious” departments, and even outright rebellions by economics students (from the post-autistic economics movement in France to post-crash economics in Britain) have largely failed to force them into the core curriculum.

Nowhere is this divide between public debate and economic reality more dramatic than in Britain, which is perhaps why it appears to be the first country where something is beginning to crack. It was center-left New Labour that presided over the pre-crash bubble, and voters’ throw-the-bastards-out reaction brought a series of Conservative governments that soon discovered that a rhetoric of austerity—the Churchillian evocation of common sacrifice for the public good—played well with the British public, allowing them to win broad popular acceptance for policies designed to pare down what little remained of the British welfare state and redistribute resources upward, toward the rich. “There is no magic money tree,” as Theresa May put it during the snap election of 2017—virtually the only memorable line from one of the most lackluster campaigns in British history. The phrase has been repeated endlessly in the media, whenever someone asks why the UK is the only country in Western Europe that charges university tuition, or whether it is really necessary to have quite so many people sleeping on the streets.

The truly extraordinary thing about May’s phrase is that it isn’t true. There are plenty of magic money trees in Britain, as there are in any developed economy. They are called “banks.” Since modern money is simply credit, banks can and do create money literally out of nothing, simply by making loans. Almost all of the money circulating in Britain at the moment is bank-created in this way. Not only is the public largely unaware of this, but a recent survey by the British research group Positive Money discovered that an astounding 85 percent of members of Parliament had no idea where money really came from (most appeared to be under the impression that it was produced by the Royal Mint).
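
To make the point concrete, here is a toy double-entry sketch of my own (not from the book or the review) showing how making a loan creates a matching deposit - new spendable money - on a bank's balance sheet:

# Toy balance sheet: a bank loan creates a matching deposit out of nothing.
# Illustrative only - a sketch of the accounting logic, not a model of any real bank.
class ToyBank:
    def __init__(self):
        self.loans = {}      # assets: what borrowers owe the bank
        self.deposits = {}   # liabilities: what the bank owes its depositors

    def make_loan(self, borrower, amount):
        # Both sides of the balance sheet expand at the same moment;
        # no pre-existing deposit is transferred to the borrower.
        self.loans[borrower] = self.loans.get(borrower, 0) + amount
        self.deposits[borrower] = self.deposits.get(borrower, 0) + amount

    def broad_money(self):
        # Bank deposits make up the bulk of the money circulating in the economy.
        return sum(self.deposits.values())


bank = ToyBank()
print(bank.broad_money())          # 0 - no money yet
bank.make_loan("firm_a", 100_000)
print(bank.broad_money())          # 100000 - created by the act of lending

Repaying the loan reverses both entries, which is why the stock of bank-created money expands and contracts with lending.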

One sign that something historically new has indeed appeared is if scholars begin reading the past in a new light. Accordingly, one of the most significant books to come out of the UK in recent years would have to be Robert Skidelsky’s Money and Government: The Past and Future of Economics.

There is a growing feeling, among those who have the responsibility of managing large economies, that the discipline of economics is no longer fit for purpose. It is beginning to look like a science designed to solve problems that no longer exist.

A good example is the obsession with inflation. Economists still teach their students that the primary economic role of government—many would insist, its only really proper economic role—is to guarantee price stability. We must be constantly vigilant over the dangers of inflation. For governments to simply print money is therefore inherently sinful. If, however, inflation is kept at bay through the coordinated action of government and central bankers, the market should find its “natural rate of unemployment,” and investors, taking advantage of clear price signals, should be able to ensure healthy growth. These assumptions came with the monetarism of the 1980s, the idea that government should restrict itself to managing the money supply, and by the 1990s had come to be accepted as such elementary common sense that pretty much all political debate had to set out from a ritual acknowledgment of the perils of government spending. This continues to be the case, despite the fact that, since the 2008 recession, central banks have been printing money frantically in an attempt to create inflation and compel the rich to do something useful with their money, and have been largely unsuccessful in both endeavors

We now live in a different economic universe than we did before the crash. Falling unemployment no longer drives up wages. Printing money does not cause inflation. Yet the language of public debate, and the wisdom conveyed in economic textbooks, remain almost entirely unchanged.

The crux of the argument always seems to turn on the nature of money. Is money best conceived of as a physical commodity, a precious substance used to facilitate exchange, or is it better to see money primarily as a credit, a bookkeeping method or circulating IOU?

The one major exception to this pattern was the mid-twentieth century, what has come to be remembered as the Keynesian age. It was a period in which those running capitalist democracies, spooked by the Russian Revolution and the prospect of the mass rebellion of their own working classes, allowed unprecedented levels of redistribution—which, in turn, led to the most generalized material prosperity in human history. The story of the Keynesian revolution of the 1930s, and the neoclassical counterrevolution of the 1970s, has been told innumerable times, but Skidelsky gives the reader a fresh sense of the underlying conflict.

Economic theory as it exists increasingly resembles a shed full of broken tools. This is not to say there are no useful insights here, but fundamentally the existing discipline is designed to solve another century’s problems. The problem of how to determine the optimal distribution of work and resources to create high levels of economic growth is simply not the same problem we are now facing: i.e., how to deal with increasing technological productivity, decreasing real demand for labor, and the effective management of care work, without also destroying the Earth. This demands a different science. The “microfoundations” of current economics are precisely what is standing in the way of this. Any new, viable science will either have to draw on the accumulated knowledge of feminism, behavioral economics, psychology, and even anthropology to come up with theories based on how people actually behave, or once again embrace the notion of emergent levels of complexity—or, most likely, both.

Intellectually, this won’t be easy. Politically, it will be even more difficult. Breaking through neoclassical economics’ lock on major institutions, and its near-theological hold over the media—not to mention all the subtle ways it has come to define our conceptions of human motivations and the horizons of human possibility—is a daunting prospect. Presumably, some kind of shock would be required. What might it take? Another 2008-style collapse? Some radical political shift in a major world government? A global youth rebellion? However it will come about, books like this—and quite possibly this book—will play a crucial part.

David Graeber - Against Economics





Every minute in 2018, Google conducted 3.88 million searches, and people watched 4.33 million videos on YouTube, sent 159,362,760 e-mails, tweeted 473,000 times and posted 49,000 photos on Instagram, according to software company Domo. By 2020 an estimated 1.7 megabytes of data will be created per second per person globally, which translates to about 418 zettabytes in a single year (418 billion one-terabyte hard drives' worth of information), assuming a world population of 7.8 billion. The magnetic or optical data-storage systems that currently hold this volume of 0s and 1s typically cannot last for more than a century, if that. Further, running data centers takes huge amounts of energy. In short, we are about to have a serious data-storage problem that will only become more severe over time.

Top 10 Emerging Technologies Of 2019

DNA DATA STORAGE IS CLOSER THAN YOU THINK
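
Two quick sketches of my own to go with the numbers above: a back-of-the-envelope check of the 418-zettabyte figure, and a toy two-bits-per-base encoding of the kind usually used to explain DNA data storage (a textbook illustration, not the scheme any particular lab uses):

# Back-of-the-envelope check of the data-volume estimate quoted above.
BYTES_PER_SECOND_PER_PERSON = 1.7e6          # 1.7 MB per second per person
POPULATION = 7.8e9
SECONDS_PER_YEAR = 365 * 24 * 3600

total_bytes = BYTES_PER_SECOND_PER_PERSON * POPULATION * SECONDS_PER_YEAR
print(f"{total_bytes / 1e21:.0f} zettabytes per year")   # ~418 ZB

# Toy illustration of DNA data storage: pack each byte into four bases,
# two bits per base (a generic textbook mapping, not a production encoding).
BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    return "".join(
        BASE_FOR_BITS[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def decode(strand: str) -> bytes:
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"hi")
print(strand)                    # "CGGACGGC"
assert decode(strand) == b"hi"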





This is something that we should all support for the 21st Century citizenry - it is about time citizens were no longer held hostage to privateering of digital public infrastructure.

‘Broadband communism’? Outside the UK, public broadband is a raving success

Around the world, governments are taking the lead in developing digital infrastructure. Nowhere is this more evident than in that well-known communist enclave: the United States of America.
This week, the Labour Party announced a bold new policy proposal that has shaken up the election race – publicly owned broadband internet, free to all. In the words of Jeremy Corbyn, the party’s leader, it is “a taster of the kind of fresh, transformational policies that will change your life.”

Under the plan, the government would purchase Openreach, the digital network operator that is a subsidiary of BT Group, and form a new publicly owned British Broadband company to extend high-speed internet access to every household, business, and institution in the country. As Mat Lawrence, Director of Common Wealth, revealed in the Guardian, given that just around 7% of premises in the UK currently have such access (compared to nearly 100% in countries like Japan and South Korea), expanding broadband access would have considerable economic, social, and environmental benefits. Moreover, making the service free would reduce household and commercial bills, putting more money back in the pockets of families and entrepreneurs.

As expected, corporate lobby groups and their political lackeys were less than thrilled at the announcement. For instance, Boris Johnson, in his usual histrionic manner, derided it as a “crazed communist scheme.” The BBC was happy to regurgitate the attack line of BT’s Managing Director, Neil McRae, who derided the proposals as “broadband communism”.

However, in reality, governments around the world are taking the lead on developing the digital infrastructure necessary to develop thriving 21st century economies (just as they did with the electricity networks, roads, bridges, railroads, airports, and other vital economic infrastructure of the 20th century). They are doing so because in many cases the private sector, and specifically a shrinking group of giant for-profit telecommunications corporations, is unable and unwilling to equitably provide the necessary investment and service – leaving whole towns, regions, and socio-economic groups shut out of the modern economy and society.


This is a worthwhile signal to pay attention to.

Video surveillance footage shows how rare violence really is

Today, videos from closed-circuit television, body cameras, police dash cameras, or mobile phones are increasingly used in the social sciences. For lack of other data, researchers previously relied on people’s often vague, partial, and biased recollections to understand how violence happened. Now, video footage shows researchers second-to-second how an event unfolded, who did what, was standing where, communicating with whom, and displaying which emotions, before violence broke out or a criminal event occurred. And while we would assume such footage highlights the cruel, brutal, savage nature of humanity, looking at violence up-close actually shows the opposite. Examining footage of violent situations – from the very cameras set up because we believe that violence lurks around every corner – suggests violence is rare and commonly occurs due to confusion and helplessness, rather than anger and hate.


This is a strong signal of the inevitable emergence of a national genetic census - not only of the populations of our ecologies, but of our human populations as well.

Every butterfly in the United States and Canada now has a genome sequence

Draft genomes of more than 800 species hint at the role of interbreeding in the animal’s evolution.
When evolutionary biologist Nick Grishin wanted to tackle big questions in evolution — why some branches of the tree of life are so diverse, for instance — his team set out to sequence the genomes of as many butterflies as it could: 845 of them, to be precise.

In a study that some researchers are hailing as a landmark in genomics, Grishin’s group at the University of Texas Southwestern Medical Center in Dallas sequenced and analysed the genome of what it called a “complete butterfly continent”: every species of the creature in the United States and Canada. The study was posted on the bioRxiv preprint server on 4 November.

“I think it’s bloody amazing, because the technology involved in sequencing 845 species is there,” says James Mallet, an evolutionary biologist at Harvard University in Cambridge, Massachusetts. “It’s a beautiful piece of work, a tour de force, to do all that.”


A small signal with potentially huge impact - we may not only be domesticating DNA but also be at the beginning of domesticating the functions of DNA.
"It is truly exciting to consider the potential for alternate genetic systems ... that these might possibly have emerged and evolved in different environments, perhaps even on other planets or moons within our solar system," said co-author Jay Goodwin, a chemist at Emory University.

DNA Just One of More Than 1 Million Possible 'Genetic Molecules,' Scientists Find

Scientists used a computer program to uncover more than 1 million molecules that could potentially store genetic information, just like DNA.
DNA and its cousin RNA store genetic information and enable life as we know it — but what if millions of lesser-known chemicals could do the exact same thing? 

A new study suggests that more than 1 million chemical look-alikes could encode biological information in the same way that DNA does. The new study, published Sept. 9 in the Journal of Chemical Information and Modeling, might point the way to new targets for pharmaceutical drugs, explain how life first evolved on Earth and even help us search for life-forms beyond our planet, the authors wrote. 

Both DNA and RNA, the two known types of nucleic acids, contain chemical bits called nucleotides, which link up in a particular order and relay different data, depending on their sequence, similar to individual letters within a written sentence. Some natural and man-made molecules mimic the basic structure of DNA, but before now, no one had attempted to count up how many of these look-alikes might exist, the authors wrote. 


The transformation of the bio-economy as a result of AI and the domestication of DNA continues to expand the horizons of the possible.

AI for plant breeding in an ever-changing climate

How might artificial intelligence (AI) impact agriculture, the food industry, and the field of bioengineering? Dan Jacobson, a research and development staff member in the Biosciences Division at the US Department of Energy's (DOE's) Oak Ridge National Laboratory (ORNL), has a few ideas.
For the past 5 years, Jacobson and his team have studied plants to understand the genetic variables and patterns that make them adaptable to changing environments and climates. As a computational biologist, Jacobson uses some of the world's most powerful supercomputers for his work—including the recently decommissioned Cray XK7 Titan and the world's most powerful and smartest supercomputer for open science, the IBM AC922 Summit supercomputer, both located at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility at ORNL.

Last year, Jacobson and his team won an Association for Computing Machinery Gordon Bell Prize after using a special computing technique known as "mixed precision" on Summit to become the first group to reach exascale speed—approximately a quintillion calculations per second.

Jacobson's team is currently working on numerous projects that form an integrated roadmap for the future of AI in plant breeding and bioenergy. The team's work was featured in Trends in Biotechnology in October.

In this Q&A, Jacobson talks about his team's work on a genomic selection algorithm, his vision for the future of environmental genomics, and the space where simulation meets AI.


A good signal of a possible new approach to capturing solar energy.

New hybrid device can both capture and store solar energy

Researchers from the University of Houston have reported a new device that can both efficiently capture solar energy and store it until it is needed, offering promise for applications ranging from power generation to distillation and desalination.
Unlike solar panels and solar cells, which rely on photovoltaic technology for the direct generation of electricity, the hybrid device captures heat from the sun and stores it as thermal energy. It addresses some of the issues that have stalled wider-scale adoption of solar power, suggesting an avenue for using solar energy around-the-clock, despite limited sunlight hours, cloudy days and other constraints.

The work, described in a paper published Wednesday in Joule, combines molecular energy storage and latent heat storage to produce an integrated harvesting and storage device for potential 24/7 operation. The researchers report a harvesting efficiency of 73% at small-scale operation and as high as 90% at large-scale operation.

Up to 80% of stored energy was recovered at night, and the researchers said daytime recovery was even higher.


Another small signal for the eventual spread of the Internet across the globe - and maybe even further.

Google is going to deploy Loon balloons in rural Peru

Google’s project Loon announced Wednesday it had signed a new deal to use its high-altitude balloons to connect rural communities in Peru to the internet. After several tests and limited-run operations following disasters, this will be the first time Loon will attempt to deliver permanent internet access.

Under the agreement, Loon will deploy its balloons to provide 4G/LTE service beginning in 2020, focusing on the country’s Loreto region, part of the Amazon rain forest. The balloons will attempt to cover 15% of Loreto and connect up to 200,000 people, many of whom are indigenous.


A good signal towards the transformation of transportation.
The flight is a big step forward to making routine commercial drone flights a reality, said Alexander Harmsen, CEO of Iris Automation. Harmsen thinks we could see commercial drone flights using this approach within a matter of months.

Drone company Iris Automation makes first-of-its-kind FAA-approved ‘blind’ drone flight

The first FAA-approved long-distance drone flight using no ground-based radar or visual observer took place recently in Kansas.

The flight is a step toward making commercial drone flights routine across the U.S.

Iris Automation, which created the onboard drone collision-avoidance system, said routine flights may happen in a matter of months.


I think this is a very LAME signal, because the information is already available to enable the airline system to know when connecting flights will be missed and to assemble a list of options that can be presented to passengers while they are in flight. Imagine: your plane is late, the connecting flight WILL be missed, and the flight attendant or airline app presents you with a list of options - alternative flights, buses, sleepovers - and all you have to do is choose and you get a new itinerary.
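
A rough sketch of the rebooking logic imagined above - entirely hypothetical flight numbers, data structures and thresholds, not any real airline API:

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

MIN_CONNECTION = timedelta(minutes=30)      # assumed minimum connection time

@dataclass
class Segment:
    flight: str
    departs: datetime
    arrives: datetime

@dataclass
class Option:
    description: str
    departs: datetime

def connection_missed(inbound: Segment, outbound: Segment, delay: timedelta) -> bool:
    # The delayed arrival plus the minimum connection time overruns the departure.
    return inbound.arrives + delay + MIN_CONNECTION > outbound.departs

def rebooking_options(outbound: Segment, alternatives: List[Option]) -> List[Option]:
    # Anything that leaves after the flight the passenger is going to miss, soonest first.
    later = [o for o in alternatives if o.departs > outbound.departs]
    return sorted(later, key=lambda o: o.departs)

# Toy itinerary: a 90-minute delay on the inbound leg.
inbound = Segment("XX101", datetime(2019, 11, 29, 8, 0), datetime(2019, 11, 29, 10, 0))
outbound = Segment("XX202", datetime(2019, 11, 29, 10, 45), datetime(2019, 11, 29, 13, 0))
alternatives = [
    Option("later flight XX204", datetime(2019, 11, 29, 16, 0)),
    Option("intercity bus", datetime(2019, 11, 29, 14, 30)),
    Option("hotel + flight XX206 next morning", datetime(2019, 11, 30, 7, 0)),
]

if connection_missed(inbound, outbound, delay=timedelta(minutes=90)):
    for option in rebooking_options(outbound, alternatives):
        print(option.departs, option.description)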

Google Flights aims to save air travelers money with new alerts on nearby airports, travel dates

Google Flights is trying to make the process of booking a flight a little easier—and less stressful—with some new updates.
Google rolled out a new feature in November for all users that notifies them when cheaper flights are available at nearby airports, Craig Ewer, a spokesperson for Google, confirmed to USA TODAY. Google also now offers a similar feature that notifies you if altering your travel dates could mean saving a significant chunk of change.

"We want people to trust that Google Flights helps you find the best flights that best fit your needs," Thijs van As, product lead for the Google Flights team, told U.S. TODAY. For example, if you're flying from New York to Washington and there is a flight available that's half the price or cheaper than what you're looking it, the site will inform you about changes to your itinerary.

Travelers don't buy tickets from Google (though they can book on the platform). The portal partners with airlines and travel agents to collect flight information. Effectively, it's a one-stop shop with insights and purchasing convenience.


This is a signal that could also transform travel and relationships, including an emerging world of mixed-reality entanglement.

A sensor-packed “skin” could let you cuddle your child in virtual reality

A second skin: A soft “skin” made from silicone could let the wearer feel objects in virtual reality. In a paper in Nature today, researchers from Northwestern University in the US and the Hong Kong Polytechnic University describe creating a multilayered material incorporating a chip, sensors, and actuators that let the wearer feel mechanical vibrations through the skin.

For now, the feeling of touch is transmitted by tapping areas on a touch screen, with the corresponding areas on the skin vibrating in real time in response. The skin is wireless and doesn’t require any batteries, thanks to inductive charging (the same technology used to charge smartphones wirelessly). Although virtual reality can recreate sounds and sights, the sense of touch has been underexplored, the researchers say.

The idea is that one day, these sorts of skins could let people communicate physically from a distance. For example, a parent could “hold” a child while on a virtual-reality video call from abroad. More immediately, it could also let VR video gamers feel strikes when playing, or help give users of prosthetic arms a better sense for the shape of objects they are holding. There has been plenty of hand-wringing in the tech industry over virtual reality’s failure to fulfill its full potential. Perhaps adding a sense of touch could help. 

Here is a longer more detailed article with a couple of short videos.

'Epidermal VR' gives technology a human touch

Imagine holding hands with a loved one on the other side of the world. Or feeling a pat on the back from a teammate in the online game "Fortnite."


This is a great signal of a future of the co-built environment - a Canadian icon colonizing the world :)

Beavers brought in to beat flooding in Britain

Beavers are to be reintroduced in two parts of Britain as part of plans to help control flooding, the National Trust announced on Wednesday.
The charity, which manages historic properties and countryside, said it aimed to release Eurasian beavers at two sites in southern England early next year.

"The dams the beavers create will hold water in dry periods, help to lessen flash-flooding downstream and reduce erosion and improve water quality by holding silt," said Ben Eardley, project manager at one of the sites.

 as "nature's engineers", whose work can help create wetland habitats to support a range of species from insects to wildfowl. Beavers in river catchment areas would "help make our landscape more resilient to climate change and the extremes of weather it will bring", Eardley said.


This is a strong signal for the need to rethink how we re-imagine and create community, with appropriate new forms of housing and institutional innovations to care for each other.

Looming crisis for older family carers

Increasing numbers of older family carers, some in their 80s and 90s, are providing care to support adult children with severe learning disabilities or autism, according to a significant new report published today (Wednesday 20 November 2019).

The report, "Confronting a looming crisis," from Professor Rachel Forrester-Jones at the University of Bath for New Forest Mencap—highlights severe strains placed on families as a result of our aging population. It suggests much greater support is needed to help them cope.

According to the National Institute for Health and Care Excellence (NICE), two-thirds of adults with learning disabilities and / or autism live with their families—mainly with parents. Yet, whereas in the late 1940s, life expectancy for people with conditions such as Down's syndrome was just 12 years, today, thanks to medical advances, it has increased to 66 (+450 percent).

Whilst this is to be celebrated, and means that people with learning disabilities can enter the retirement phase of the life cycle, it has also created a situation in which many adult carers, some in their 80s and 90s, are now caring for adult children, themselves in their 50s and 60s.

Thursday, November 21, 2019

Friday Thinking 22 Nov 2019

Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.)  that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st  Century

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9

Content
Quotes:

Articles:




People who trust the media more are more knowledgeable about politics and the news. The more people trust science, the more scientifically literate they are. Even if this evidence remains correlational, it makes sense that people who trust more should get better at figuring out whom to trust. In trust as in everything else, practice makes perfect.

But then, the puzzle only deepens: if trusting provides such learning opportunities, we should trust too much, rather than not enough. Ironically, the very reason why we should trust more – the fact that we gain more information from trusting than from not trusting – might make us inclined to trust less.

When our trust is disappointed – when we trust someone we shouldn’t have – the costs are salient, and our reaction ranges from annoyance all the way to fury and despair. The benefit – what we’ve learnt from our mistake – is easy to overlook. By contrast, the costs of not trusting someone we could have trusted are, as a rule, all but invisible. We don’t know about the friendship we could have struck (if we’d let that acquaintance crash at our place). We don’t realise how useful some advice would have been (had we used our colleague’s tip about the new software application).

We should consider these hidden costs and benefits: think of what we learn by trusting, the people whom we can befriend, the knowledge that we can gain.

Giving people a chance isn’t only the moral thing to do. It’s also the smart thing to do.

The smart move: we learn more by trusting than by not trusting
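
The asymmetry described above is easy to see in a toy simulation - my own sketch, not from the article: an agent who trusts keeps getting feedback and converges on the true rate of trustworthiness, while an agent who starts out distrustful never trusts, never learns, and stays stuck with its pessimistic estimate.

import random

random.seed(0)
TRUE_TRUSTWORTHINESS = 0.7    # assumed fraction of people who honour our trust

def final_estimate(initial_estimate, rounds=1000):
    # Weak prior: behave as if we had already seen 10 interactions.
    observations = 10.0
    successes = observations * initial_estimate
    estimate = initial_estimate
    for _ in range(rounds):
        if estimate >= 0.5:                   # only trust when it looks worthwhile
            honoured = random.random() < TRUE_TRUSTWORTHINESS
            successes += honoured             # feedback arrives only when we trust
            observations += 1
            estimate = successes / observations
        # when we withhold trust we learn nothing, so the estimate never moves
    return estimate

print(final_estimate(0.6))   # trusting prior: converges toward the true 0.7
print(final_estimate(0.4))   # distrustful prior: stuck at 0.4 - no feedback, no learning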




Epicurus emphasised the pleasures of learning and speculating about nature and the social world, and Lucretius pointed out that what was exceptional about human beings was their creativity and handiwork. People enjoy figuring things out and getting things to work, or just getting things to look, sound and taste better, for themselves and for others. Real enjoyment arises from activities that activate concentration, that require practice and skill, and that deliver sensory enjoyment. The ability of our hands to manipulate small objects with speed and precision is unique to humans. Together with the appreciation of beauty in colour and form, this endowment adds the arts to the sciences, as the best that humans can do.

One of the tragedies of life in civilisation is that most human work doesn’t require or develop human ingenuity and artistry. Nevertheless, every human being who is not living in conditions of total cultural deprivation can activate them. The traditional pastimes of childhood were activities carried out for their own sake: crafts and puzzles, reading about animals, history, far-off places and the future, exploring the outdoors, and helping adults and younger children. Their adult equivalents are found in kitchens, sewing rooms, garages and workshops, along with libraries and lecture rooms. Making things such as pottery, jewellery, knitted, embroidered and stitched items, and fixing things around the house is a profound source of human satisfaction. In these activities, hands, eyes and mind are engaged with the material world, and it is your own taste and judgment that determine the outcome.

How to be an Epicurean




Many people have jobs that are not meaningful to them, jobs that are pursued mostly for money. Because of the common disconnect of the industrial worker from the material product of her effort, it is understandable that the focus of work easily shifts to monetary compensation.

Earnings become a symbol and cause of a successful working life. The workers’ value becomes tied to their value as measured by their financial rewards.

….the industrial experience and ethics where there is a clear hierarchy between the higher value of the desired end and the lower value of the means to get that end. In extractive, mass-industrial settings, if the result was financially desirable, then any method, even ethically questionable ones, were justified to achieve it. The end justified the means. Mindful work sees things differently. In the post-industrial, post-fossil world, we have to reconstruct the relation of means to ends. This is partly because the negative externalities that businesses create are going to be factored in the total net impact calculation of value created and/or destroyed.

Mindful and mindless work




imagine you are hiking in a valley. To reach the next valley, you need to climb a large mountain, which requires a lot of work. Now, imagine that you could tunnel through the mountain to get to the next valley, with no real effort required. This is what quantum mechanics allows, under certain conditions. In fact, if the two valleys have exactly the same shape, you would be simultaneously located in both valleys.

Chemists observe 'spooky' quantum tunneling
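
For reference - a standard textbook estimate, not taken from the article - the probability of tunnelling through a barrier of height $V_0$ and width $L$ falls off exponentially:

T \approx e^{-2 \kappa L}, \qquad \kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}

where $m$ is the particle's mass and $E$ its energy. The exponential sensitivity to $m$ is why electrons and light atoms tunnel readily while hikers never do.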





“You know, what people don’t understand is that AI never fires anybody. Robots don’t take your job. It’s your manager who fires you. It’s the CEO who approves an AI that eats jobs. All AI does is expedite the expeditable. Human damage and market self-interest? Totally different story — but that’s where the action really is.”

“Whatever AI can do, and whatever you can’t do, will boil down to what and why and how we choose to digitally articulate the care and the intimacy of the relationships we already have. It’s a choice.”

For my money, my colleague named the threat implicit in both The Robots Are Coming! and Inhuman Power. AI’s gravest problems aren’t technical or ethical; they’re metaphysical. As the cartoon character Pogo said decades ago, we have met the enemy — invented it, actually, if the enemy AI be — and he is us.

Intelligence Test - Anticipating an artificial world




When the immunologist De’Broski Herbert at the University of Pennsylvania looked deep inside the lungs of mice infected with influenza, he thought he was seeing things. He had found a strange-looking cell with a distinctive thatch of projections like dreadlocks atop a pear-shaped body, and it was studded with taste receptors. He recalled that it looked just like a tuft cell — a cell type most often associated with the lining of the intestines.

But what would a cell covered with taste receptors be doing in the lungs? And why did it only appear there in response to a severe bout of influenza?

One of life’s fundamental challenges is to find food that’s good to eat and avoid food that isn’t. Outside of our modern world of prepackaged food on grocery store shelves, it’s a perilous task. Taking advantage of a new type of food could mean the difference between starvation and survival, or it could mean an early death from accidental self-poisoning. Chemosensory receptors help us make this distinction. They’re so essential that even single-celled bacteria such as Escherichia coli carry a type of this receptor. 

#MetaphorsOfSelf 

Cells That ‘Taste’ Danger Set Off Immune Responses





Here is a signal that speaks to the reality of governments 'printing money' - something that Modern Monetary Theory highlights as a fundamental insight for a new economic paradigm. Governments always print money - the issue is who are they printing it for?

The Federal Reserve is in stealth intervention mode

What the central bank passes off as ‘funding issues’ could more accurately be described as liquidity injections to keep interest rates low
The Federal Reserve has gone into full intervention mode.
Actually, accelerated intervention mode. Not just a “mid-cycle adjustment,” as Fed Chairman Jerome Powell said in July, but interventions to the tune of tens of billions of dollars every day.

What’s the crisis, you ask? After all, we live in an age of trillion-dollar market-cap companies and unemployment at 50-year lows. Yet the Fed is acting like the doomsday clock has melted as a result of a nuclear attack.
Think I’m in hyperbole mode? Far from it.

Unless you think the biggest repurchase (repo) efforts ever — surpassing the 2008 financial-crisis actions — are hyperbole.


This is a good signal of the future of pharmaceuticals - and more. Closet Gene/DNA labs filled with used equipment bought on eBay already exist.

Who shrank the drug factory? Briefcase-sized labs could transform medicine

Engineers are miniaturizing pharmaceutical production in the hope of making it portable and inexpensive.
Govind Rao greets visitors to his lab just outside Baltimore with two things: a warm handshake and a chart. Almost before introductions are complete, Rao ushers guests into his windowless office at the University of Maryland, Baltimore County (UMBC), and pulls up a graph on his battered laptop. On it, a steeply sloping line charts health spending in the United States over the past 40 years.

“It just goes up and up. How many lives is this costing?” he asks.

Rao’s solution sits in a sleek, stainless-steel briefcase on a table across from his desk. He pops the latch and flips open the lid to reveal a series of interconnected, fist-sized black boxes. They are filled with vials the size of a paper clip, fed by syringes and joined by clear plastic tubes not much thicker than a human hair. Add a source of electricity, some freeze-dried cell parts and a pinch of DNA, and the portable devices allow anyone to start making sophisticated drugs for just a few dollars. The system is called Bio-MOD, or Biologic Medications on Demand, and Rao says that it has the potential to change the direction of the precipitous curve on his laptop.

Rao isn’t alone. Teams at the Massachusetts Institute of Technology (MIT) in Cambridge, Virginia Commonwealth University (VCU) in Richmond, and hospitals in Latin America and Europe have all been experimenting with producing on-demand pharmaceuticals. Their prototype systems represent a complete reinvention of drug manufacturing.


This article has some great signals summarizing the emerging transformation of food production - one that also transforms the carrying capacity of the Earth by implementing a new technological framework.

These companies are leading an alternative proteins revolution

- The alternative protein market is expanding rapidly, and could reach $140 billion within the next 10 years
- Numerous companies are now entering the market, hoping to take advantage of increased consumer interest in sustainable food production
- Critics have accused 'fake meat' of being overly processed and full of chemicals

It may sound like science fiction, but in a few short years the family dinner table may be laden with steak from a printer and other proteins produced from air, methane or volcanic microbes.

With the explosive success of vegan beef and burger substitutes developed by Beyond Meat and Impossible Foods, the alternative protein sector just keeps growing.

According to investment bank Barclays, alternative meat sales could reach $140 billion - or 10% of the global meat industry - within a decade, or a 10-fold increase from current levels.


This is a great signal of progress in understanding biology and how cancer arises.

Researchers unlock cancer cells' feeding mechanism, central to tumor growth

The findings could lead to new treatments by blocking tumor growth at its roots
An international team of researchers has discovered the energy production mechanism of cancerous cells that drives the growth of the nucleolus and causes tumors to rapidly multiply.

The findings, published Aug. 1 in the journal Nature Cell Biology, could lead to the development of new cancer treatments that would stop tumor growth by cutting the energy supply to the nucleolus.

"The nucleolus is the 'eye' of the cancer storm that ravages patients' bodies. Being able to control the eye would be a true game-changer in cancer treatment," said Atsuo Sasaki, PhD, associate professor at the UC College of Medicine and one of the research team's lead investigators.


This is an important signal on many levels - the concept of homeostasis depends not only on ubiquitous sensors but also on how they can contribute to a 'self-regulatory-index-of-status'. Biology provides foundational enactions of self-other regulation. The core of tasting what is self - or - not good-for-self. There is a good 7 min video.
Researchers around the world are tracing the ancient evolutionary roots that olfactory and taste receptors (collectively called chemosensory receptors or nutrient receptors) share with the immune system. A flurry of work in recent years shows that their paths cross far more often than anyone anticipated, and that this chemosensory-immunological network plays a role not just in infection, but in cancer and at least a handful of other diseases.

Cells That ‘Taste’ Danger Set Off Immune Responses

Taste and smell receptors in unexpected organs monitor the state of the body’s natural microbial health and raise an alarm over invading parasites.
When the immunologist De’Broski Herbert at the University of Pennsylvania looked deep inside the lungs of mice infected with influenza, he thought he was seeing things. He had found a strange-looking cell with a distinctive thatch of projections like dreadlocks atop a pear-shaped body, and it was studded with taste receptors. He recalled that it looked just like a tuft cell — a cell type most often associated with the lining of the intestines.

But what would a cell covered with taste receptors be doing in the lungs? And why did it only appear there in response to a severe bout of influenza?

Herbert wasn’t alone in his puzzlement over this mysterious and little-studied group of cells that keep turning up in unexpected places, from the thymus (a small gland in the chest where pathogen-fighting T cells mature) to the pancreas. Scientists are only just beginning to understand them, but it is gradually becoming clear that tuft cells are an important hub for the body’s defenses precisely because they can communicate with the immune system and other sets of tissues, and because their taste receptors allow them to identify threats that are still invisible to other immune cells.


And this is a very fascinating signal related to a new medical paradigm.
“In theory you can go after almost anything. Poisons, pathogens, viruses, bacteria, anything that we can specifically bind to we can remove. So it’s a very powerful potential tool," Frodsham told The Telegraph. “When someone has a tumour you cut it out. Blood cancer is a tumour in the blood, so why not just take it out in the same way? Now we know it’s possible, it’s just a question of figuring out some of the details.”

Magnetic Tool That Removes Diseases From the Blood Set For Human Trials

A magnetic tool that can pull leukemia and malaria out of the blood is expected to start trials next year.
UK biochemist George Frodsham has reportedly discovered a way to remove disease-causing microbes from the blood using magnets, and now his company is gearing up to start human trials.

Studying magnetic nanoparticles and how they bind to cells in the body gave Frodsham the idea a few years back to use the same principles to extract the agents of diseases such as leukemia, sepsis, and malaria from the blood.

The idea is that any virus, blood cancer or infection can be removed from the body thanks to tiny magnetic particles, removing the need for medication or treatments like chemotherapy.

His idea turned into MediSieve, which was spun out from University College London in 2015 to develop and commercialize magnetic blood filtration. It has raised £2.1M in equity funding and grants totaling £2.M. Now the first human trials of its technology, which is named after the company, are expected to start in 2020.

The MediSieve system was developed to integrate with existing hospital pumps, connecting to user-supplied blood lines that interface with cannulas or catheters used for venous access. The MediSieve Filter, the heart of the system, is a single-use, disposable magnetic filter which captures and retains the magnetic components. The company says a patient's total blood volume can be filtered in under an hour. 


This will be oldish news - but still provides a powerful signal for the future of human health and the domestication of DNA.

Implantable artificial kidney achieves preclinical milestone

The Kidney Project, a national effort to develop an implantable bio-artificial kidney that could eliminate the need for dialysis, will announce a key milestone in a November 7, 2019 presentation at the American Society of Nephrology Kidney Week 2019 conference in Washington, DC.

The team will report that UC San Francisco scientists have successfully implanted a prototype kidney bioreactor containing functional human kidney cells into pigs without significant safety concerns. The device, which is about the size of a deck of cards, did not trigger an immune reaction or cause blood clots in the animals, an important milestone on the road to future human trials.

"This is the first demonstration that kidney cells can be implanted successfully in a large animal without immunosuppression and remain healthy enough to perform their function. This is a key milestone for us," said Kidney Project co-lead Shuvo Roy, Ph.D., a faculty member in the Department of Bioengineering and Therapeutic Sciences, a joint department of the UCSF Schools of Pharmacy and Medicine. "Based on these results, we can now focus on scaling up the bioreactor and combining it with the blood filtration component of the artificial kidney."


This is a good signal of the slow process of exploring the brain-computer-world interface. From restoration to augmentation may be a much smaller step.
"I still can't put it into words. I mean, from being able to see absolutely nothing, it's pitch black. To all of a sudden seeing little flickers of light move around," Esterhuizen trailed off in awe as he spoke to UCLA Health.

A Brain Implant Gave This Blind Man Some Sense of Sight Again

The device, called the Orion, was implanted over the visual cortex in Esterhuizen’s brain. It converts images from a tiny video camera on a pair of sunglasses into a series of electrical pulses. The pulses stimulate electrodes in Esterhuizen’s brain, allowing him to see patterns of light, which act as visual cues.

The glasses also come with a belt that includes a button that can be pushed in order to amplify dark objects in the sun and light objects in the dark.

Currently, the implant stimulates the left side of the patient’s brain, so they can only perceive visual cues from their right field of vision. The ultimate goal is for the implant to work on both sides of the brain for a full field of vision.


This is a first hint of a possible future well known in Science Fiction.

Humans placed in suspended animation for the first time

Doctors have placed humans in suspended animation for the first time, as part of a trial in the US that aims to make it possible to fix traumatic injuries that would otherwise cause death.

Samuel Tisherman, at the University of Maryland School of Medicine, told New Scientist that his team of medics had placed at least one patient in suspended animation, calling it “a little surreal” when they first did it. He wouldn’t reveal how many people had survived as a result.

The technique, officially called emergency preservation and resuscitation (EPR), is being carried out on people who arrive at the University of Maryland Medical Centre in Baltimore with an acute trauma – such as a gunshot or stab wound – and have had a cardiac arrest. Their heart will have stopped beating and they will have lost more than half their blood. There are only minutes to operate, with a less than 5 per cent chance that they would normally survive.

EPR involves rapidly cooling a person to around 10 to 15°C by replacing all of their blood with ice-cold saline. The patient’s brain activity almost completely stops. They are then disconnected from the cooling system and their body – which would otherwise be classified as dead – is moved to the operating theatre.

A surgical team then has 2 hours to fix the person’s injuries before they are warmed up and their heart restarted. Tisherman says he hopes to be able to announce the full results of the trial by the end of 2020.


I think this is a weak but awesome signal of a potential future of a metabolic economy - where everything produced can be ‘metabolized’ into other uses when current uses end.

Scientists develop industrial-strength adhesive which can be unstuck in magnetic field

Researchers at the University of Sussex have developed a glue which can unstick when placed in a magnetic field, meaning products otherwise destined for landfill could now be dismantled and recycled at the end of their life.
Currently, items like mobile phones, microwaves and car dashboards are assembled using adhesives. It is a quick and relatively cheap way to make products but, due to problems dismantling the various materials for different recycling methods, most of these products will be destined for landfill.

However, Dr. Barnaby Greenland, Lecturer in Medicinal Chemistry, working in conjunction with Stanelco RF Technologies Ltd and Prof Wayne Hayes at the University of Reading, may have found a solution.

In a new research paper, published in the European Polymer Journal, Dr. Greenland and the team describe a new type of adhesive which contains tiny particles of metal. When passed through an alternating electromagnetic field, the glue melts and products simply fall apart.

The adhesive works with plastic, wood, glass and metal and in terms of strength, is comparable to those currently used in industry.
Dr. Greenland said: "In as little as 30 seconds, we can unstick items using a relatively weak magnetic field.


A great signal of the continuing improvements of generating renewable power.

Superconducting wind turbine chalks up first test success

The EcoSwing consortium designed, developed, and manufactured a full-size superconducting generator for a 3.6 megawatt wind turbine, and field-tested it in Thyborøn, Denmark.
They report their results in the IOP Publishing journal Superconductor Science and Technology.

Corresponding author Anne Bergen, from the University of Twente, The Netherlands, said: "Wind turbine size has grown significantly over the last few decades. However, today's technology has trouble keeping up with the trend towards ever-increasing unit power levels.

The team employed rare-earth barium copper oxide (ReBCO) high-temperature superconducting generators. These require a smaller amount of rare-earth materials than permanent-magnet (PM) machines, resulting in a lower cost. Superconductors can also carry high current densities, which results in more power-dense coils and a lower weight.


I wonder if any great poet has ever written an ODE to the Toilet? Surely this is a device that has transformed the world and enabled great increases in health, wellbeing, comfort and convenience. But it can be improved.
“Our coating can be applied by simply spray-coating or wiping directly onto the surface, and it is very easy to apply. Household users can apply the coating by themselves.”

This slippery new coating could make toilets less filthy

The sprayable coating could also help reduce wastewater
A slippery new coating could make the crappiest place in your home a little cleaner. Developed by researchers at Penn State University, this two-part product promises to keep your toilet bowl clean, stink-free, and — potentially — set the stage for toilets to use less water in the future.

Worldwide, about 37 billion gallons of fresh water are flushed down toilets every day, say the inventors of the new product who published the results of their work this week in the journal Nature Sustainability. The reason we send so much water down the drain? It takes a lot of water to get rid of the bulk of our waste. Or as the authors put it in the paper: “human faeces is viscoelastic and sticky in nature, causing it to adhere to conventional surfaces.”

If people could make toilets more slippery, less water would be needed to get the results of those bowel movements moving down the drain. That’s where the new liquid-entrenched smooth surface (LESS) comes in. LESS consists of two sprayable coatings that can be applied to carbon steel, ceramics, or other hard surfaces. The first spray dries into thin, hair-like structures so small that they aren’t visible to the naked eye. The second is a lubricant that coats those “hairs,” making waste, water, and even bacteria slide off easily.


This is a great signal about the power of visuals to enable us to think the unthinkable.

How to turn the complex mathematics of vector calculus into simple pictures

Feynman diagrams revolutionized particle physics. Now mathematicians want to do the same for vector calculus.
Back in 1948, the journal Physical Review published a paper entitled “Space-Time Approach to Quantum Electrodynamics” by a young physicist named R.P. Feynman at Cornell University. The paper described a new way to solve problems in electrodynamics using matrices. However, it is remembered today for a much more powerful invention—the Feynman diagram, which appeared there in print for the first time.

Feynman diagrams have had a huge impact in physics. They are pictorial representations of the mathematics that describe the interaction between subatomic particles. Mathematically, each interaction is an infinite series, so even simple interactions between particles are fantastically complex to write down in this way.

Feynman’s genius was to represent these series with simple lines in a graphical format, allowing scientists to think about particle physics in new and exciting ways.

Feynman and others immediately began to extend their ideas using this graphical shorthand. Indeed, the American physicist Frank Wilczek, who worked with Feynman in the 1980s, once wrote: “The calculations that eventually got me a Nobel Prize in 2004 would have been literally unthinkable without Feynman diagrams.”

… every physics and engineering undergraduate spends many happy hours struggling with the mathematics and the arcane notation that it requires. The problem is that vector fields are intricate entities—they assign a single vector to every point in three-dimensional space and can themselves be representations of more complex mathematical objects called differentiable manifolds. So at its very simplest, a vector field can be an infinite list of vectors.
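
To make the 'infinite list of vectors' concrete - a standard textbook example, not taken from the article - a vector field on three-dimensional space is just a map

\mathbf{F} : \mathbb{R}^3 \to \mathbb{R}^3, \qquad \text{e.g.} \quad \mathbf{F}(x, y, z) = (-y,\; x,\; 0)

a field that swirls around the z-axis. Even the simplest operations on it involve component-by-component bookkeeping:

\nabla \cdot \mathbf{F} = \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z} = 0, \qquad \nabla \times \mathbf{F} = (0,\, 0,\, 2)

and it is exactly this sort of index-chasing that a good diagrammatic notation aims to tame.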


Here is how to create 3D visuals that can be felt. The world of mixed reality (including digitally connected xxx experiences) is emerging ever nearer. There is a 5 min video illustrating the current state.

Star Wars-style 3D images created from single speck of foam

Ultrasonic speakers steer tiny bead to generate displays that you can touch and hear.
“It’s an elegant and exciting platform,” says Daniel Smalley, a physicist at Brigham Young University in Provo, Utah, who last year unveiled a similar technique, using lasers to steer around a fleck of cellulose to produce images. Until now, few physicists thought it would be possible to use sound to move a bead fast enough to create such a display, he says. In August, Tatsuki Fushimi, a physicist at the University of Bristol, UK, and his collaborators became the first to show that it was feasible. But their bead took longer to trace out shapes, meaning that only images smaller than 1 centimetre across could appear as a single, continuous object. The Sussex team’s work is “a piece of engineering that makes us believe things we didn’t think were possible,” says Smalley.

The acoustic device, described in Nature on 13 November, is the latest example of a 3D-image-generation technology known as volumetric display, which differs in fundamental ways from technologies such as holograms, virtual reality and stereoscopes. Those more-familiar approaches use tricks of the light to create the illusion of depth, and can be life-sized and photorealistic. But holograms can be seen only from certain angles, virtual reality and stereoscopes require headgear, and all these techniques can cause eye strain. Free-space volumetric displays, by contrast, use lasers, electric fields, fog projections and other approaches to create truly 3D images that viewers can see from any vantage point. In that way, they’re the closest any display technology has come to Princess Leia’s SOS message in the 1977 film Star Wars.