Thursday, May 14, 2020

Friday Thinking 15 May 2020


Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.)  that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st  Century

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9

Content
Quotes:

Articles:



The Many Vital Uses of Sugars
There’s a reason why genomics and proteomics have leapt ahead of glycomics: The sheer complexity of sugars makes them more difficult to study. DNA, RNA and proteins are linear molecules built according to defined sets of rules, and scientists have the tools to sequence, analyze and manipulate them. But glycans are branching structures that assemble without a known template. The same site on two identical proteins might be occupied by very different glycans, for instance. Glycans also have exponentially more potential configurations than DNA or proteins: Three different nucleotides can make six distinct DNA sequences; three amino acids can make six unique peptides; three glycan building blocks can form more than a thousand structures. Glycans are flexible, wobbly and variable; intricate, dynamic and somewhat unpredictable. Their analysis demands greater technical expertise and more sophisticated equipment.
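
To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The "choices per bond" figure is an illustrative assumption of mine, not the accounting glycobiologists actually use; the point is only that per-bond linkage and configuration choices, plus branching, multiply the count far beyond the six orderings available to a linear polymer.

import math

# Linear polymers (DNA, peptides): three distinct units can only be reordered.
linear_sequences = math.factorial(3)
print(linear_sequences)                      # 6

# Glycans: each of the two bonds in a trisaccharide also picks an attachment
# position and an alpha/beta configuration, and the chain can branch.
choices_per_bond = 8                         # hypothetical, for illustration only
linear_trisaccharides = linear_sequences * choices_per_bond ** 2
print(linear_trisaccharides)                 # 384 - already 64x the peptide count
# Adding branched topologies and ring/linkage variants pushes the total past
# the "more than a thousand" figure quoted above.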

Without taking sugars into account, we can’t fully understand how proteins and cells interact and function. “Imagine a world in which each of us knew only a fraction of the alphabet,” wrote Jamey Marth, a molecular biologist at the University of California, Santa Barbara, in a recent letter to Nature Cell Biology.

Sugary Camouflage on Coronavirus Offers Vaccine Clues




In complex environments, the way to proficiency is to recombine successful elements to create new versions, some of which may thrive.

the new networks involve relations of dynamic interdependence. …  Minimal hierarchy, organizational diversity and responsiveness characterize these architectures. They are a necessary response to the increasing fuzziness of strategic horizons and short half-life of designs. Because of greater complexity, coordination cannot be planned in advance. Authority needs to be distributed; it is no longer delegated vertically but emerges horizontally. Under distributed authority work teams and knowledge workers need to be accountable to other work teams and other knowledge workers. Achievement depends on learning by mutual accountability and responsiveness.

Success is based on continuous redefinition of the organization itself. It is about recombining options and contributions in a competitive and cooperative environment. Creativity is the default state of all human work. Even the most creative people are more remixers of other people’s ideas than lone inventors. Technology and development in general are not isolated acts by independent thinkers, but a complex storyline.

Collaborative and Competitive Creativity




For decades, Lovelock has warned of the global heating that will permanently alter human and nonhuman ways of life. His recent publications reveal an understanding, shared with Spinoza, that these natural transformations are profoundly amoral. Gaia strives to preserve itself, to preserve life as such: Gaia, God or nature doesn’t have any interest in preserving this or that species, or any particular configuration of the Earth. Lovelock also shares with Spinoza the understanding that human transformations of the Earth are part of nature, however much we might think of certain actions as harming or destroying nature. By seeking our own advantage and transforming our environment, human beings don’t destroy nature: we are nature, transforming itself. The effects of these activities are, from nature’s point of view, neither good nor bad.

For Spinoza, ‘power’ (potentia) refers to a thing’s capacity to be what it is and to act from its nature. Each thing strives to persevere in its being, to seek its own advantage, and to do those things that follow from its nature to achieve these ends. Spinoza links power to virtue and explains that the more we strive for what is advantageous to our being and acting, the more powerful, and therefore virtuous, we are. We seek to do those things that preserve our being and increase our power to act. As we act in ways that are good for us but bad for nature, it is easy to see how our power appears fearful. Seeking to preserve our being and increase our power to act involves being bound up in complex systems of energy extraction and food production that contribute directly to climate change.

Fear of our own power means that we fear what follows from our own nature: we fear our essential tendency to seek our advantage. We fear our power only to the extent that we doubt its effects. Fear is a sadness that arises from imagining an uncertain future outcome. Once doubt about the future has been removed, fear becomes either despair or confidence. Where there is doubt about the outcome of seeking our advantage, we fear our power; where there is no doubt that our actions are destructive, we despair of it.

We are nature




The Top Five Regrets of the Dying were:
I wish I’d had the courage to live a life true to myself, not the life others expected of me;
I wish I hadn’t worked so hard;
I wish I’d had the courage to express my feelings;
I wish I had stayed in touch with my friends; and
I wish that I had let myself be happier.

Sooner or later we all face death. Will a sense of meaning help us?





This is a long read - but well worth it.
The first lesson a disaster teaches is that everything is connected. In fact, disasters, I found while living through a medium-sized one (the 1989 earthquake in the San Francisco Bay Area) and later writing about major ones (including 9/11, Hurricane Katrina and the 2011 Tōhoku earthquake and Fukushima nuclear catastrophe in Japan), are crash courses in those connections.
I have found over and over that the proximity of death in shared calamity makes many people more urgently alive, less attached to the small things in life and more committed to the big ones, often including civil society or the common good.
Hope offers us clarity that, amid the uncertainty ahead, there will be conflicts worth joining and the possibility of winning some of them. And one of the things most dangerous to this hope is the lapse into believing that everything was fine before disaster struck, and that all we need to do is return to things as they were.

'The impossible has already happened': what coronavirus can teach us about hope

In the midst of fear and isolation, we are learning that profound, positive change is possible.
Disasters begin suddenly and never really end. The future will not, in crucial ways, be anything like the past, even the very recent past of a month or two ago. Our economy, our priorities, our perceptions will not be what they were at the outset of this year. The particulars are startling: companies such as GE and Ford retooling to make ventilators, the scramble for protective gear, once-bustling city streets becoming quiet and empty, the economy in freefall. Things that were supposed to be unstoppable stopped, and things that were supposed to be impossible – extending workers’ rights and benefits, freeing prisoners, moving a few trillion dollars around in the US – have already happened.

The word “crisis” means, in medical terms, the crossroads a patient reaches, the point at which she will either take the road to recovery or to death. The word “emergency” comes from “emergence” or “emerge”, as if you were ejected from the familiar and urgently need to reorient. The word “catastrophe” comes from a root meaning a sudden overturning.

We have reached a crossroads, we have emerged from what we assumed was normality, things have suddenly overturned. One of our main tasks now – especially those of us who are not sick, are not frontline workers, and are not dealing with other economic or housing difficulties – is to understand this moment, what it might require of us, and what it might make possible.

A disaster (which originally meant “ill-starred”, or “under a bad star”) changes the world and our view of it. Our focus shifts, and what matters shifts. What is weak breaks under new pressure, what is strong holds, and what was hidden emerges. Change is not only possible, we are swept away by it. We ourselves change as our priorities shift, as intensified awareness of mortality makes us wake up to our own lives and the preciousness of life. Even our definition of “we” might change as we are separated from schoolmates or co-workers, sharing this new reality with strangers. Our sense of self generally comes from the world around us, and right now, we are finding another version of who we are.

As the pandemic upended our lives, people around me worried that they were having trouble focusing and being productive. It was, I suspected, because we were all doing other, more important work. When you’re recovering from an illness, pregnant or young and undergoing a growth spurt, you’re working all the time, especially when it appears you’re doing nothing. Your body is growing, healing, making, transforming and labouring below the threshold of consciousness. As we struggled to learn the science and statistics of this terrible scourge, our psyches were doing something equivalent. We were adjusting to the profound social and economic changes, studying the lessons disasters teach, equipping ourselves for an unanticipated world.

When a caterpillar enters its chrysalis, it dissolves itself, quite literally, into liquid. In this state, what was a caterpillar and will be a butterfly is neither one nor the other, it’s a sort of living soup. Within this living soup are the imaginal cells that will catalyse its transformation into winged maturity. May the best among us, the most visionary, the most inclusive, be the imaginal cells – for now we are in the soup. The outcome of disasters is not foreordained. It’s a conflict, one that takes place while things that were frozen, solid and locked up have become open and fluid – full of both the best and worst possibilities. We are both becalmed and in a state of profound change. …


For anyone wanting an excellent homemade mask - this article has a link to DIY instructions.
The name N95 comes from the fact that the masks filter 95% of airborne particles, such as viruses. Lab results show Tommye's masks block 96.5%.

This nurse didn't just create a replacement N95 mask—hers works better

As Tommye Austin made her way around the COVID-19 unit, she saw patients on ventilators fighting for each breath. She heard nurses, respiratory therapists and other workers talking about how anxious they were about being exposed to the coronavirus, and perhaps spreading it to their loved ones.

N95s weren't intended for all-day use, so they tend to carve painful, unsightly marks into noses, cheeks and chins. Hers don't.
With nowhere for exhaled carbon dioxide to escape, N95 wearers sometimes suffer dizziness or headaches. Hers have an air pocket so the CO2 floats away.

The "TM 2020"—the letters stand for "Tommye Mask"—took about 10 days to create. And 24 hours to become an internet sensation.

The hospital was so inundated with requests for details that it posted step-by-step instructions later that week. (Unlike Tommye, Scribd requires you to sign up for a free account if you want to download the instructions.)


This is a vitally important signal for adaptive re-imagining of our political economies and the future - not of work - but of how to enable every single person to create value in the rapidly emerging digital environment.

Finnish basic income pilot improved wellbeing, study finds

First major study of scheme comes as economic toll of coronavirus prompts fresh interest in idea
Europe’s first national, government-backed basic income experiment did not do much to encourage recipients into work but did improve their mental wellbeing, confidence and life satisfaction, according to the first big study of a Finnish scheme that has attracted fresh interest in the coronavirus outbreak.

“The basic income recipients were more satisfied with their lives and experienced less mental strain than the control group,” the study, by researchers at Helsinki University, concluded. “They also had a more positive perception of their economic welfare.”

The study comes as the devastating economic fallout from the coronavirus crisis - including soaring unemployment worldwide - sparks renewed interest in basic income schemes. The pope suggested in his Easter address that “this may be the time to consider a universal basic wage”.

The Spanish government said last month it aimed to roll out a basic income “as soon as possible” to about a million of the country’s poorest households, with the economic affairs minister, Nadia Calviño, saying the Socialist-led government hoped a universal basic income would become “a permanent instrument”.

Scotland’s first minister, Nicola Sturgeon, said this week the virus and its economic consequences had “made me much, much more strongly of the view that [universal basic income] is an idea whose time has come”.


The adult entertainment industries have always been on the cutting edge of technology, adopting technologies and strategies before other industries. They may provide some important lessons for the next phase of the COVID-19 pandemic - a phase that will be challenging and in which success will be vital.
“In many ways, what they are doing is a model for what we are trying to do with Covid,” said Ashish Jha, a physician who directs Harvard University’s Global Health Institute and has been calling for widespread national testing to contain the coronavirus. “The adult film industry teaches us that as a proof of concept, this can work. We just have to scale it up.”
Performers say they have a lot to teach the rest of the post-Covid-19 world. “We’re already used to working in an environment of risk. That’s something the rest of the normal world is just learning to do.”

Why the porn industry has a lot to teach us about safety in the Covid-19 era

LOS ANGELES — As states and employers furiously develop plans to safely reopen workplaces in the midst of the coronavirus pandemic, they’re grappling with what seems like an endless list of questions: where to test, who to test, and how often to test for the virus? Further complicating matters are issues of workers’ privacy, geography, politics, science, and cost. It’s a difficult mandate. But there is one place to look for guidance — the adult film industry.

Since the late 1990s, when an outbreak of HIV infections threatened to shutter the multibillion-dollar industry, the mainstream porn community has implemented procedures that require all performers to be tested for HIV and a host of other sexually transmitted infections every 14 days before they can be cleared to work. Any HIV-positive test leads to an immediate shutdown of all U.S. sets, followed by detailed contact tracing before sets can reopen. While not perfect, those in the industry say the nationwide PASS program works to protect thousands of performers, ensures safer workplaces, and curtails the spread of disease.

In the 20 years it has been in place, PASS has met, and overcome, many of the same challenges that any large-scale coronavirus testing program might encounter, from issues of keeping databases of private medical information secure, preventing the forging of test results, dealing with false positive results, and educating workers about the need for repeated testing to keep workplaces safe. Those devising strategies to reopen workplaces and the larger economy during the coronavirus pandemic say their plans would involve, at their core, processes of rigorous testing, isolation, and contact tracing similar to those used in the adult film industry.


We are learning so much about biology - if only we could treat ‘fat heads’. This is a promising signal for enabling better health for humans suffering from an abundance of food.
"We've developed a proof of concept here that you can regulate weight gain by modulating the activity of these inflammatory cells," said principal investigator Steven L. Teitelbaum, MD, the Wilma and Roswell Messing Professor of Pathology & Immunology. "It might work in a number of ways, but we believe it may be possible to control obesity and the complications of obesity by better regulating inflammation."

Obesity prevented in mice treated with gene-disabling nanoparticles

By disabling a gene in specific mouse cells, researchers at Washington University School of Medicine in St. Louis have prevented mice from becoming obese, even after the animals had been fed a high-fat diet.
The researchers blocked the activity of a gene in immune cells. Because these immune cells—called macrophages—are key inflammatory cells and because obesity is associated with chronic low-grade inflammation, the researchers believe that reducing inflammation may help regulate weight gain and obesity.

The study is published May 1 in The Journal of Clinical Investigation.

When people are obese, they burn fewer calories than those who are not obese. The same is true for mice. But according to co-first author Wei Zou, MD, Ph.D., assistant professor of pathology and immunology, the researchers found that obese mice maintained the same level of calorie burning as mice that were not obese—after the research team deleted the ASXL2 gene in the macrophages of the obese mice and, in a second set of experiments, after they injected the animals with nanoparticles that interfere with the gene's activity.


This is an important signal - relating to the death of Moore’s Law - Long Live Moore’s Law. Yes - this is not specifically about integrated circuits and chips - but it is about exponential progress in computational capabilities.

AI and Efficiency

We’re releasing an analysis showing that since 2012 the amount of compute needed to train a neural net to the same performance on ImageNet classification has been decreasing by a factor of 2 every 16 months. Compared to 2012, it now takes 44 times less compute to train a neural network to the level of AlexNet (by contrast, Moore’s Law would yield an 11x cost improvement over this period). Our results suggest that for AI tasks with high levels of recent investment, algorithmic progress has yielded more gains than classical hardware efficiency.
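
A quick sanity check of those numbers, assuming the measurement window runs roughly the seven years from AlexNet (2012) to 2019 and treating Moore's Law as a doubling every 24 months - both assumptions mine, not OpenAI's exact methodology:

months = 7 * 12                               # ~2012 to ~2019

algorithmic_halving_months = 16               # from the excerpt above
hardware_doubling_months = 24                 # classic Moore's Law cadence (assumption)

algorithmic_gain = 2 ** (months / algorithmic_halving_months)
hardware_gain = 2 ** (months / hardware_doubling_months)

print(round(algorithmic_gain))                # ~38x - same order as the quoted 44x
print(round(hardware_gain))                   # ~11x - matching the excerpt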

Algorithmic improvement is a key factor driving the advance of AI. It’s important to search for measures that shed light on overall algorithmic progress, even though it’s harder than measuring such trends in compute.


Beauty is in the eye of the beholder. This is a great signal of emerging capacity to see more than ever before - even the movement of light.
"We envision applications in a rich variety of extremely fast phenomena, such as ultrashort light propagation, wave propagation, nuclear fusion, photon transport in clouds and biological tissues, and fluorescent decay of biomolecules, among other things," 

New ultrafast camera takes 70 trillion pictures per second

The new camera developed in the lab of Lihong Wang, Bren Professor of Medical Engineering and Electrical Engineering in the Andrew and Peggy Cherng Department of Medical Engineering, is capable of taking as many as 70 trillion frames per second. That is fast enough to see waves of light traveling and the fluorescent decay of molecules.

The camera technology, which Wang calls compressed ultrafast spectral photography (CUSP), is similar in some respects to previous fast cameras he has built, such as his phase-sensitive compressed ultrafast photography, or pCUP, device, which can take 1 trillion frames per second of transparent objects and phenomena.


This is another signal of the advancing trajectory of human ability to sense - in this case visually.
"It may be possible for a modified mobile phone to have images of the nano-world,"
Through very precise control of these LEDs, which are about 1,000 times smaller than the diameter of a human hair, the ChipScope imaging system has a spatial resolution of just below 200 nanometres—which is the usual limit with visible light.
It could readily provide images of airborne nanoparticles, including those smaller than 2.5 microns, considered the most dangerous to human health.

'Microscope on a chip' could bring medical expertise to distant patients

Scientists are reducing the size and costs of medical microscopes to make it possible to use them more widely, and hook them up to experts able to diagnose an illness even from far away.

Falling ill in a remote part of the world can mean difficulty in finding the right care. Even where medical help may be available, it may be impossible to get a definitive diagnosis because of a lack of specialist expertise and laboratory equipment, such as microscopes.

Advanced miniaturisation means patients could benefit from a 'microscope on a chip', according to Dr. Angel Dieguez, senior lecturer in the Department of Electronics and Biomedical Engineering at the University of Barcelona, Spain.

He runs a project called ChipScope, which uses some of the smallest light sources ever manufactured to push the limits of conventional optics, in a device potentially compact enough to fit in a pocket.

Dr. Dieguez estimates the microscope and control electronics cost less than €1,000 for the prototype being developed, and that further development and economies of scale could bring this down to as little as €10 or so.


This is a weak signal - but a seriously important indicator of the trajectory of new science breakthroughs and new ways of sensing.

Scientists demonstrate quantum radar prototype

Physicists at the Institute of Science and Technology Austria (IST Austria) have invented a new radar prototype that uses quantum entanglement as a method of object detection. This successful integration of quantum mechanics into devices could significantly impact the biomedical and security industries. The research is published in the journal Science Advances.

Now, scientists from the research group of Professor Johannes Fink at the Institute of Science and Technology Austria (IST Austria) along with collaborators Stefano Pirandola from the Massachusetts Institute of Technology (MIT) and the University of York, UK, and David Vitali from the University of Camerino, Italy—have demonstrated a new type of detection technology called microwave quantum illumination that utilizes entangled microwave photons as a method of detection. The prototype, which is also known as a quantum radar, is able to detect objects in noisy thermal environments where classical radar systems often fail. The technology has potential applications for ultra-low power biomedical imaging and security scanners.

Using quantum entanglement as a new form of detection
The working principles behind the device are simple: Instead of using conventional microwaves, the researchers entangle two groups of photons, which are called the signal and idler photons. The signal photons are sent out towards the object of interest, whilst the idler photons are measured in relative isolation, free from interference and noise. When the signal photons are reflected back, true entanglement between the signal and idler photons is lost, but a small amount of correlation survives, creating a signature or pattern that describes the existence or the absence of the target object—irrespective of the noise within the environment.


Understanding ourselves and any part of ourselves (e.g. the brain) has always depended on some metaphor - from an internal homunculus to a mechanical automaton to a computer. Advances always arise when we can evolve our metaphors - especially when incorporating the necessity of error in learning.

Researchers develop a new model for how the brain processes complex information

The human brain is a highly advanced information processor composed of more than 86 billion neurons. Humans are adept at recognizing patterns from complex networks, such as languages, without any formal instruction. Previously, cognitive scientists tried to explain this ability by depicting the brain as a highly optimized computer, but there is now discussion among neuroscientists that this model might not accurately reflect how the brain works.

Now, Penn researchers have developed a different model for how the brain interprets patterns from complex networks. Published in Nature Communications, this new model shows that the ability to detect patterns stems in part from the brain's goal to represent things in the simplest way possible. Their model depicts the brain as constantly balancing accuracy with simplicity when making decisions. The work was conducted by physics Ph.D. student Christopher Lynn, neuroscience Ph.D. student Ari Kahn, and professor Danielle Bassett.

This new model is built upon the idea that people make mistakes while trying to make sense of patterns, and these errors are essential to get a glimpse of the bigger picture. "If you look at a pointillist painting up close, you can correctly identify every dot. If you step back 20 feet, the details get fuzzy, but you'll gain a better sense of the overall structure," says Lynn.


This is still a weak signal - but also an important indicator of progress in domesticating photosynthesis.

Cyber-spinach turns sunlight into sugar

Combination of biological membrane and artificial chemistry could power future synthetic organisms.
There’s a new way to eat carbon dioxide. Researchers have built an artificial version of a chloroplast, the photosynthetic structures inside plant cells. It uses sunlight and a laboratory-designed chemical pathway to turn CO2 into sugar.

Artificial photosynthesis could be used to drive tiny, non-living, solar-powered factories that churn out therapeutic drugs. And because the new chemical pathway is more efficient than anything nature has evolved, the team hopes that a similar process could some day even help to remove CO2 from the atmosphere — although it is not clear whether it could be turned into a large-scale, economically feasible operation. The work was published in Science on 7 May.

Although it’s just a proof of principle, it’s already possible to think of ways in which the artificial chloroplasts could be put to work, the authors say. Because of advances in synthetic biology, microbes can now be engineered to churn out useful molecules such as pharmaceutical drugs. But there are limits to what can be synthesized inside living cells. Erb says that the artificial chloroplasts could power non-living mini-reactors to produce molecules that living cells cannot.

Thursday, May 7, 2020

Friday Thinking 8 May 2020

Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.)  that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st  Century

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9


Content
Quotes:


Articles



In the spring of 1822 an employee in one of the world’s first offices – that of the East India Company in London – sat down to write a letter to a friend. If the man was excited to be working in a building that was revolutionary, or thrilled to be part of a novel institution which would transform the world in the centuries that followed, he showed little sign of it. “You don’t know how wearisome it is”, wrote Charles Lamb, “to breathe the air of four pent walls, without relief, day after day, all the golden hours of the day between ten and four.” His letter grew ever-less enthusiastic, as he wished for “a few years between the grave and the desk”. No matter, he concluded, “they are the same.”

The world that Lamb wrote from is now long gone. The infamous East India Company collapsed in ignominy in the 1850s. Its most famous legacy, British colonial rule in India, disintegrated a century later. But his letter resonates today, because, while other empires have fallen, the empire of the office has triumphed over modern professional life.

Even before coronavirus struck, the reign of the office had started to look a little shaky. A combination of rising rents, the digital revolution and increased demands for flexible working meant its population was slowly emigrating to different milieux. More than half of the American workforce already worked remotely, at least some of the time. Across the world, home working had been rising steadily for a decade. Pundits predicted that it would increase further. No one imagined that a dramatic spike would come so soon.

It’s too early to say whether the office is done for. As with any sudden loss, many of us find our judgment blurred by conflicting emotions. Relief at freedom from the daily commute and pleasure at turning one’s back on what Philip Larkin called “the toad work” are tinged with regret and nostalgia, as we prepare for another shapeless day of WFH in jogging bottoms.

The most transformatory aspect of offices was less the buildings themselves than the sheer amount of time we spent in them. This would have seemed alien to many earlier societies. Mary Beard, professor of classics at Cambridge University, notes that elite Romans strived to switch off as much as possible. “Our division between leisure and work is reversed in the Roman world. What we mostly do is work and, when we’re not working, we’re at leisure.” In Rome it was the other way round for the elite: “The normal state of play is otium, it’s leisure. And sometimes, you’re not at leisure, you’re doing business, which is negotium.” Though the English word “business” has an inbuilt aura of action and industry, the Latin neg-otium – literally “not leisure” – has an almost grudging sense of pleasure denied.

When time-and-motion studies examine offices today, their results can be dispiriting. Office-work takes up not merely the bulk of our time but the best part of it, the hours when we are alert and alive. Home, and its occupants, has the husk.

Death of the office



Why are ‘apocalyptic’ stories of civilisational collapse so appealing in contrast with the more complex and nuanced narratives tentatively suggested by many archaeologists? At least since the early 20th century, we have been looking forward to the end, on a global scale. Spanning popular and academic culture, the father of modern futurology, H G Wells, announced in a 1902 lecture to the Royal Institution in London that:

It is impossible to show why certain things should not utterly destroy and end the human race and story; why night should not presently come down and make all our dreams and efforts vain … something from space, or pestilence, or some great disease of the atmosphere, some trailing cometary poison, some great emanation of vapour from the interior of the Earth, or new animals to prey on us, or some drug or wrecking madness in the mind of man.

Stories of mass destruction, societal breakdown and civilisational collapse run deep in our culture, from Sodom and Gomorrah, destroyed by a wrathful god, to the destruction of Atlantis, submerged under the sea after a massive earthquake. No matter whether literally true or not, these remain two of the most well-known stories in our society – dramatic and vivid, easy to imagine and see. The destruction of Pompeii has captivated audiences for centuries, spawning theatrical reconstructions known as ‘volcano entertainments’, replete with dancers, fireworks and an erupting volcano, novels such as Sir Edward Bulwer-Lytton’s bestselling novel The Last Days of Pompeii (1834), and feature films and documentaries, in addition to the many popular and scholarly books.

Do civilisations collapse?



The emergence of a global, intelligent logistics network seems inevitable on our good days. It signals both the possibility of a shift toward a global consciousness and the possibility of global surveillance. This is a good article exploring the supply chain.
I set out to find the answer, and what I found surprised me. We consumers are not the only ones afflicted with this selective blindness. The corporations that make use of supply chains experience it too. And this partial sight, erected on a massive scale, is what makes global capitalism possible.
Supply chains are robust precisely because they’re decentralized and self-healing. In this way, these physical infrastructures distributed all over the world are very much like the invisible network that makes them possible: the internet.
Walmart demands perfect control over certain aspects of its supply chain, like price and delivery times, while at the same time refusing knowledge about other aspects, like labor practices and networks of subcontractors. 
This peculiar state of knowing-while-not-knowing is not the explicit choice of any individual company but a system that’s grown up to accommodate the variety of goods that we demand, and the speed with which we want them. It’s embedded in software, as well as in the container ships that are globalization’s most visible emblem.

See No Evil

Software helps companies coordinate the supply chains that sustain global capitalism. How does the code work—and what does it conceal?
I had heard similar claims about other industries. There was the Fairphone, which aimed at its launch in 2013 to be the first ethically produced smartphone, but admitted that no one could guarantee a supply chain completely free from unfair labor practices. And of course one often hears about exploitative labor practices cropping up in the supply chains of companies like Apple and Samsung: companies that say they make every effort to monitor labor conditions in their factories.


Putting aside my cynicism for the moment, I wondered: What if we take these companies at their word? What if it is truly impossible to get a handle on the entirety of a supply chain?


The thing that still confused me is how reliable supply chains are, or seem to be. The world is unpredictable—you’ve got earthquakes, labor strikes, mudslides, every conceivable tragedy—and yet as a consumer I can pretty much count on getting what I want whenever I want it. How can it be possible to predict a package’s arrival down to the hour, yet know almost nothing about the conditions of its manufacture?


In the past twenty years, popular and academic audiences have taken a growing interest in the physical infrastructure of global supply chains. The journalist Alexis Madrigal’s Containers podcast took on the question of how goods travel so far, so quickly. The writer Rose George traveled the world on a container ship for her book Ninety Percent of Everything. And Marc Levinson’s The Box startled Princeton University Press by becoming a national bestseller. Most recently, Deborah Cowen’s The Deadly Life of Logistics offered a surprisingly engrossing history of that all-important industry.


while workplace abuses get a lot of attention when they take place in the supply chains of large, prestigious companies like Apple and Samsung, working conditions are actually most opaque and labor abuse is most rampant in other industries, like apparel and agriculture. “Apparel, every quarter they have 100 percent turnover in the clothing that they make, so it’s a whole new supply chain every season. And with food, there’s millions of farmers involved. So in these places, where there’s way too many nodes for anyone to see without a computer, and where the chain changes by the time you’ve monitored it—those are the places where we see a lot of problems and instability.”


This is an excellent account of the rise, spread and mutations of the COVID-19 virus. It also signals how far and how fast gene-sequencing capacity has advanced. The graphics are accessible and worth viewing.

How Coronavirus Mutates and Spreads

The coronavirus is an oily membrane packed with genetic instructions to make millions of copies of itself. The instructions are encoded in 30,000 “letters” of RNA — a, c, g and u — which the infected cell reads and translates into many kinds of virus proteins.


A New Coronavirus Dec. 26
In December, a cluster of mysterious pneumonia cases appeared around a seafood market in Wuhan, China. In early January, researchers sequenced the first genome of a new coronavirus, which they isolated from a man who worked at the market. That first genome became the baseline for scientists to track the SARS-CoV-2 virus as it spreads around the world.
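
As a toy illustration of what tracking against a baseline means in practice, the sketch below compares a hypothetical reference fragment with a hypothetical sequenced fragment and reports the point mutations. Real pipelines align full ~30,000-letter genomes and handle insertions and deletions; this assumes equal-length strings.

baseline = "auggugcacgau"                     # hypothetical reference fragment
sample   = "auggugcaugau"                     # hypothetical sequenced fragment

mutations = [
    (pos + 1, ref, obs)                       # 1-based position, reference letter, observed letter
    for pos, (ref, obs) in enumerate(zip(baseline, sample))
    if ref != obs
]
print(mutations)                              # [(9, 'c', 'u')]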


This is another good article explaining some of the confusion about SARS-CoV-2 virus.
SARS-CoV-2 is the virus. COVID-19 is the disease that it causes. The two aren’t the same. The disease arises from a combination of the virus and the person it infects, and the society that person belongs to. Some people who become infected never show any symptoms; others become so ill that they need ventilators. Early Chinese data suggested that severe and fatal illness occurs mostly in the elderly, but in the U.S. (and especially in the South), many middle-aged adults have been hospitalized, perhaps because they are more likely to have other chronic illnesses. The virus might vary little around the world, but the disease varies a lot.

Why the Coronavirus Is So Confusing

A guide to making sense of a problem that is now too big for any one person to fully comprehend
….much else about the pandemic is still maddeningly unclear. Why do some people get really sick, but others do not? Are the models too optimistic or too pessimistic? Exactly how transmissible and deadly is the virus? How many people have actually been infected? How long must social restrictions go on for? Why are so many questions still unanswered?


The confusion partly arises from the pandemic’s scale and pace. Worldwide, at least 3.1 million people have been infected in less than four months. Economies have nose-dived. Societies have paused. In most people’s living memory, no crisis has caused so much upheaval so broadly and so quickly. “We’ve never faced a pandemic like this before, so we don’t know what is likely to happen or what would have happened,” says Zoë McLaren, a health-policy professor at the University of Maryland at Baltimore County. “That makes it even more difficult in terms of the uncertainty.”


But beyond its vast scope and sui generis nature, there are other reasons the pandemic continues to be so befuddling—a slew of forces scientific and societal, epidemiological and epistemological. What follows is an analysis of those forces, and a guide to making sense of a problem that is now too big for any one person to fully comprehend. ...


Deaths are hard to tally in general, and the process differs among diseases. The CDC estimates that flu kills 24,000 to 62,000 Americans every year, a number that seems superficially similar to the 58,000 COVID-19 deaths thus far. That comparison is misleading. COVID-19 deaths are counted based either on a positive diagnostic test for the coronavirus or on clinical judgment. Flu deaths are estimated through a model that looks at hospitalizations and death certificates, and accounts for the possibility that many deaths are due to flu but aren’t coded as such. If flu deaths were counted like COVID-19 deaths, the number would be substantially lower. This doesn’t mean we’re overestimating the flu. It does mean we are probably underestimating COVID-19.


A brief article providing some tips on making homemade masks and some data on their effectiveness.

Homemade masks made of silk and cotton may boost protection: study

In the wake of the COVID-19 pandemic, the U.S. Centers for Disease Control and Prevention recommends that people wear masks in public. Because N95 and surgical masks are scarce and should be reserved for health care workers, many people are making their own coverings out of fabric. Now, a preliminary study published in ACS Nano by University of Chicago and Argonne National Laboratory researchers suggests that a combination of masks made of high thread-count cotton with natural silk fabric or a chiffon weave can effectively filter out aerosol particles––if the fit is good.


Though the study does not attempt to replicate real-world conditions, the findings are a useful guide. The researchers pointed out that tightly woven fabrics, such as cotton, can act as a mechanical barrier to particles; whereas fabrics that hold a static charge, like certain types of chiffon and natural silk, can serve as an electrostatic barrier. The electrostatic effect serves to suck in and hold the tiniest particles, which might otherwise slip through holes in the cotton. This is key to how N95 masks are constructed.


However, Guha added, even a small gap reduced the filtering efficiency of all masks by half or more, emphasizing the importance of a properly fitted mask.
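
A minimal sketch of the layering logic, assuming each layer captures particles independently - an idealization, since the study measured real fabric combinations (and fit) directly; the efficiency numbers below are hypothetical, not the study's measurements:

def combined_efficiency(*layer_efficiencies):
    # Independent layers: penetrations multiply, so efficiencies compound.
    penetration = 1.0
    for eff in layer_efficiencies:
        penetration *= (1.0 - eff)            # fraction slipping past this layer
    return 1.0 - penetration

cotton = 0.70                                 # hypothetical mechanical-capture efficiency
silk = 0.60                                   # hypothetical electrostatic-capture efficiency
print(f"{combined_efficiency(cotton):.2f}")   # 0.70
print(f"{combined_efficiency(cotton, silk):.2f}")  # 0.88
# A gap that bypasses the fabric lets particles skip every layer at once,
# which is why even a small leak halves the measured efficiency.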


This is an important signal for the role our microbial profile plays in maintaining  a robust immune system.

Fecal transplantation improves outcomes in patients with multi-drug resistant organisms

Transferring fecal matter from the digestive systems of healthy donors to extremely ill patients who had previously been infected with drug-resistant bacteria resulted in shorter hospital stays, fewer bloodstream infections and infections that were easier to treat, according to research that was selected for presentation at Digestive Disease Week (DDW) 2020. DDW data will be published in the May online supplements to Gastroenterology and GIE: Gastrointestinal Endoscopy.


The study's researchers conducted the transfer, known as fecal microbial transplantation or FMT, in 20 patients infected during extensive medical care with multi-drug resistant organisms, including carbapenemase-producing Enterobacteriaceae (such as Escherichia coli), vancomycin-resistant enterococci or extended-spectrum beta-lactamase Enterobacteriaceae. Patients were followed for six months after the transplantation and their clinical course compared with six months prior to FMT.


While the resistant bacteria were cleared in only 41 percent of the 17 patients who completed the full follow-up, researchers found other benefits to the patients, who had been repeatedly hospitalized and treated for a variety of severe conditions. The sample included hematological cancer patients in need of stem-cell transplants and kidney transplant patients with urinary and bloodstream infections with bacteria that were multi-drug resistant.


This is a fascinating signal of our growing understanding of the bioecology that maintains our health and sense of self. If the immune system is the fundamental system that recognizes ‘self’ vs ‘other’ it is interesting how much ‘self depends on other’.

Scientists study growth rate effect of gut bacteria on degradation of dietary fiber

It is known that approximately 80% of the human immune system functions within the gastrointestinal tract. Gut bacteria and their metabolites play a fundamental role in the interaction between gut and other organs. Since the organic acids produced by colon bacteria (acetate, lactate, propionate, succinate and butyrate) activate a number of immune and hormonal processes, the microbiota composed of hundreds of different bacterial species is of vital importance for the normal functioning and health of the human body.


Senior Researcher Kaarel Adamberg, head of the Microbiomics Research Group of TalTech Department of Chemistry and Biotechnology, says, "Food is a crucial factor in modulating the gut microbiota and its metabolism. Very important nutrients for the colon bacteria are the dietary compounds that are not broken down by enzymes in the stomach and small intestine and, thus, reach the large intestine. A person requires at least 25 to 35 grams of fibre a day for normal bowel function. Water-binding fibers also promote the movement of food in the digestive tract."


This is an amazing signal that offers promise in the challenges of malaria as well as signaling progress in understanding disease ecologies. 

Malaria 'completely stopped' by microbe

Scientists have discovered a microbe that completely protects mosquitoes from being infected with malaria.
The team in Kenya and the UK say the finding has "enormous potential" to control the disease.
Malaria is spread by the bite of infected mosquitoes, so protecting them could in turn protect people.
The researchers are now investigating whether they can release infected mosquitoes into the wild, or use spores to suppress the disease.


The malaria-blocking bug, Microsporidia MB, was discovered by studying mosquitoes on the shores of Lake Victoria in Kenya. It lives in the gut and genitals of the insects.


The researchers could not find a single mosquito carrying the Microsporidia that was harbouring the malaria parasite. And lab experiments, published in Nature Communications, confirmed the microbe gave the mosquitoes protection.


Microsporidias are fungi, or at least closely related to them, and most are parasites.
However, this new species may be beneficial to the mosquito and was naturally found in around 5% of the insects studied.


When does something become part of our sense of self? Maybe when it becomes part of our sensorium.
"Our study shows that a prosthetic hand attached to the bone and controlled by electrodes implanted in nerves and muscles can operate much more precisely than conventional prosthetic hands. We further improved the use of the prosthesis by integrating tactile sensory feedback that the patients use to mediate how hard to grab or squeeze an object. Over time, the ability of the patients to discern smaller changes in the intensity of sensations has improved," says Max Ortiz Catalan.
Since receiving their prostheses, the patients have used them daily in all their professional and personal activities.

Mind-controlled arm prostheses that can 'feel'

For the first time, people with arm amputations can experience sensations of touch in a mind-controlled arm prosthesis that they use in everyday life. A study in the New England Journal of Medicine reports on three Swedish patients who have lived for several years with this new technology, one of the world's most integrated interfaces between humans and machines.


The advance is unique: The patients have used a mind-controlled prosthesis in everyday life for up to seven years. For the last few years, they have also lived with a new function—sensations of touch in the prosthetic hand. This is a new concept for artificial limbs, which are called neuromusculoskeletal prostheses, as they are connected to the user's nerves, muscles and skeleton.


"The most important contribution of this study was to demonstrate that this new type of prosthesis is a clinically viable replacement for a lost arm. No matter how sophisticated a neural interface becomes, it can only deliver real benefit to patients if the connection between the patient and the prosthesis is safe and reliable in the long term. Our results are the product of many years of work, and now we can finally present the first bionic arm prosthesis that can be reliably controlled using implanted electrodes, while also conveying sensations to the user in everyday life," continues Max Ortiz Catalan.


The discovery of the memristor about 12 years ago promised some significant new computational capabilities. The excitement then was overblown - but progress continues - especially in combination with AI. This is a weak signal of the emergence of new computational capabilities.

A 3-D memristor-based circuit for brain-inspired computing

Researchers at the University of Massachusetts and the Air Force Research Laboratory Information Directorate have recently created a 3-D computing circuit that could be used to map and implement complex machine learning algorithms, such as convolutional neural networks (CNNs). This 3-D circuit, presented in a paper published in Nature Electronics, comprises eight layers of memristors: electrical components that regulate the electrical current flowing in a circuit and directly implement neural network weights in hardware.


"Previously, we developed a very reliable memristive device that meets most requirements of in-memory computing for artificial neural networks, integrated the devices into large 2-D arrays and demonstrated a wide variety of machine intelligence applications," Prof. Qiangfei Xia, one of the researchers who carried out the study, told TechXplore. "In our recent study, we decided to extend it to the third dimension, exploring the benefit of a rich connectivity in a 3-D neural network."


Essentially, Prof. Xia and his team were able to experimentally demonstrate a 3-D computing circuit with eight memristor layers, which can all be engaged in computing processes. Their circuit differs greatly from other previously developed 3-D circuits, such as 3-D NAND flash, as these systems are usually comprised of layers with different functions (e.g. a sensor layer, a computing layer, a control layer, etc.) stacked or bonded together.
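
For readers unfamiliar with why memristor arrays suit neural networks, here is an idealized sketch of the underlying idea - weights stored as conductances so that a vector-matrix multiply happens in one analog step. It ignores wire resistance, device noise and the 3-D stacking described in the paper; it illustrates the general technique, not the team's circuit, and the numeric ranges are hypothetical.

import numpy as np

# Each weight lives in a crossbar as a conductance G[i, j].  Driving row
# voltages and summing column currents (Kirchhoff's current law) computes
# I_j = sum_i V_i * G[i, j] - one multiply-accumulate per column, in parallel.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(8, 4))      # conductances in siemens (hypothetical range)
v = rng.uniform(0.0, 0.2, size=8)             # input voltages on the 8 rows

column_currents = v @ G                       # Ohm's law + Kirchhoff's law in one line
print(column_currents)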


This is a great weak signal for the future of the ‘quantified self’, social science research (e.g. the sociometric badge) and the return of Google Glass.

FitByte uses sensors on eyeglasses to automatically monitor diet

A new wearable from researchers in Carnegie Mellon University's School of Computer Science helps wearers track their food habits with high fidelity.

FitByte, a noninvasive, wearable sensing system, combines the detection of sound, vibration and movement to increase accuracy and decrease false positives. It could help users reach their health goals by tracking behavioral patterns, and gives practitioners a tool to understand the relationship between diet and disease and to monitor the efficacy of treatment.


The device tracks all stages of food intake. It detects chewing, swallowing, hand-to-mouth gestures and visuals of intake, and can be attached to any pair of consumer eyeglasses. "The primary sensors on the device are accelerometers and gyroscopes, which are in almost every device at this point, like your phones and your watches," said Mayank Goel, an assistant professor in the Institute for Software Research and the Human-Computer Interaction Institute.


An infrared proximity sensor detects hand-to-mouth gestures. To identify chewing, the system monitors jaw motion using four gyroscopes around the wearer's ears. The sensors look behind the ear to track the flexing of the temporal muscle as the user moves their jaw. High-speed accelerometers placed near the glasses' earpiece perceive throat vibrations during swallowing. This technology addresses the longstanding challenge of accurately detecting drinking, and the intake of soft things like yogurt and ice cream.
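
As a rough illustration of the kind of signal processing such a system might use - and emphatically not the FitByte pipeline itself - the hypothetical sketch below flags chewing windows by averaging gyroscope magnitude over one-second windows and thresholding; the sample rate and threshold are assumptions.

import numpy as np

def chewing_windows(gyro_xyz, fs_hz=100, window_s=1.0, threshold=0.5):
    # gyro_xyz: (N, 3) angular rates in rad/s; returns one True/False per window.
    magnitude = np.linalg.norm(gyro_xyz, axis=1)
    win = int(fs_hz * window_s)
    return [
        magnitude[k * win:(k + 1) * win].mean() > threshold   # sustained jaw motion
        for k in range(len(magnitude) // win)
    ]

# Synthetic example: 2 s of stillness, then 2 s of ~1.5 Hz oscillatory "chewing".
fs = 100
t = np.arange(4 * fs) / fs
signal = np.zeros((len(t), 3))
signal[2 * fs:, 0] = 1.5 * np.sin(2 * np.pi * 1.5 * t[2 * fs:])
print(chewing_windows(signal, fs_hz=fs))      # [False, False, True, True]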


Speaking of the ‘quantified self’, here’s a great weak signal of the emerging digital environment augmented with virtual information.

'Not just weeds': how rebel botanists are using graffiti to name forgotten flora

Pavement chalking to draw attention to wild flowers and plants in urban areas has gone viral across Europe – but UK chalkers could face legal action
A rising international force of rebel botanists armed with chalk has taken up street graffiti to highlight the names and importance of the diverse but downtrodden flora growing in the cracks of paths and walls in towns and cities across Europe.


The idea of naming wild plants wherever they go – which began in France – has gone viral, with people chalking and sharing their images on social media. More than 127,000 people have liked a photo of chalked-up tree names in a London suburb, while a video of botanist Boris Presseq of Toulouse Museum of Natural History chalking up names to highlight street flowers in the French city has had 7m views.


This is a good signal on a number of levels. One basic signal is that science is a deeply social collaborative effort. This is important to remember when we think the world is falling apart. This is also a signal of the ongoing work in profound basic science that will change this century. We have barely begun to see Big Data.
ALICE's 1,917 participants from 177 institutes and 40 nations are united in trying to better understand the nature of matter at extreme temperature and density.
Read's team delivered 3,276 circuit boards (plus 426 spares) for readout of the half a million TPC channels. The electronics upgrade makes it possible to digitize and distribute 5 million samples per second per channel.
"Non-stop data output totaling 3 terabytes per second will flow from the Time Projection Chamber, 24/7, during data taking," Read explained. "Historically, many experiments have dealt with megabyte per second, or even gigabyte per second, data rates. Real-time processing of streaming scientific data at 3 terabytes per second is approaching unique in the world. This is a big data problem of immense proportions."

Major upgrades of particle detectors and electronics prepare CERN experiment to stream a data tsunami

For a gargantuan nuclear physics experiment that will generate big data at unprecedented rates—called A Large Ion Collider Experiment, or ALICE—the University of Tennessee has worked with the Department of Energy's Oak Ridge National Laboratory to lead a group of U.S. nuclear physicists from a suite of institutions in the design, development, mass production and delivery of a significant upgrade of novel particle detectors and state-of-the-art electronics, with parts built all over the world and now undergoing installation at CERN's Large Hadron Collider (LHC).


"This upgrade brings entirely new capabilities to the ALICE experiment," said Thomas M. Cormier, project director of the ALICE Barrel Tracking Upgrade (BTU), which includes an electronics overhaul that is among the biggest ever undertaken by DOE's Office of Nuclear Physics.