
Friday Thinking 15 Nov 2019

Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.) that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.

In the 21st Century curiosity will SKILL the cat.

Jobs are dying - Work is just beginning.
Work that engages our whole self becomes play that works.
Techne = Knowledge-as-Know-How :: Technology = Embodied Know-How  
In the 21st century - the planet is the little school house in the galaxy.
Citizenship is the battlefield of the 21st Century

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9




In terms of population numbers, few species can compare to the success of humans. Though much attention on population size focuses on the past 200 years, humans were incredibly successful even before the industrial revolution, populating all of the world's environments with more than a billion people. Kramer uses her research on Maya agriculturalists of Mexico's Yucatan Peninsula and the Savanna Pumé hunter-gatherers of Venezuela to illustrate how cooperative childrearing increases the number of children that mothers can successfully raise and—in environments where beneficial—even speed up maturation and childbearing. Kramer argues that intergenerational cooperation, meaning that adults help support children, but children also share food and many other resources with their parents and other siblings, is at the center of humans' demographic success. "Together our diet and life history, coupled with an ability to cooperate, made us really good at getting food on the table, reproducing, and surviving," says Kramer.

How human population came from our ability to cooperate



A certain picture of how vision works emerged, one in which signals from the eyes moved through passive sets of neurological filters that created increasingly specialized and complex representations of the environment. Only later in the process did that visual representation get integrated with information from other senses and other brain areas. “It’s tempting to see primary sensory areas as cameras that give an unadulterated view of what’s happening in the world,” said Anne Churchland, a neuroscientist at Cold Spring Harbor Laboratory in New York — but however elegant this model of vision might be, a mountain of evidence has proved it to be far too simplistic.


The brain’s immense interconnectivity fills it with feedback loops that let higher cortical areas talk to lower ones. Over decades of study, researchers have gradually found each region of the brain to be less specialized than labels might suggest: The visual cortex of people who are blind or visually impaired, for instance, can process auditory and tactile information. The somatosensory cortex, and not the motor cortex, was recently found to play a significant role in the learning of motor-based skills. And broader forces like attention, expectation or motivation can affect how people perceive.


Even in the dark, the neurons of the visual cortex continue to chatter.


Sensory information represents only a small part of what’s needed to truly perceive the environment. “You need to take into account movement, your body relative to the world, in order to figure out what’s actually out there.”


“We used to think that the brain analyzed all these things separately and then somehow bound them together,” McCormick said. “Well, we’re starting to learn that the brain does that mixing of multisensory and movement binding [earlier] than we previously imagined.”


“Our brains aren’t just thinking in our heads. Our brains are interacting with our bodies and the way that we move through the world,” Niell said. “You think, ‘Oh, I’m just thinking,’ or ‘I’m just seeing.’ You don’t think about the fact that your body is playing a role in that.”

‘Noise’ in the Brain Encodes Surprisingly Important Signals



Sometime in June 2003, Mel Karmazin, the president of Viacom, one of the largest media conglomerates in the world, walked into the Google offices in Mountain View, California. Google was a hip, young tech company that made money – actual money! – off the internet. Karmazin was there to find out how.


Larry Page and Eric Schmidt, Google’s founder and its CEO respectively, were already seated in the conference room when co-founder Sergey Brin came in, out of breath. He was wearing shorts. And roller skates.


The Google guys told Karmazin that the search engine’s earnings came from selling advertisements. Companies could buy paid links to websites that would appear at the top of users’ search results. And Google worked as a middleman, connecting websites with ad space to advertisers eager to get their banners seen.


Schmidt continued: "Our business is highly measurable. We know that if you spend X dollars on ads, you’ll get Y dollars in revenues." At Google, Schmidt maintained, you pay only for what works.


Karmazin was horrified. He was an old-fashioned advertising man, and where he came from, a Super Bowl ad cost three million dollars. Why? Because that’s how much it cost. What does it yield? Who knows.


"I’m selling $25bn of advertising a year," Karmazin said. "Why would I want anyone to know what works and what doesn’t?"
Leaning on the table, hands folded, he gazed at his hosts and told them: "You’re fucking with the magic." 


In the early 1990s, the internet sounded the death knell for that era of advertising. Today, we no longer live in the age of Mad Men, but of Math Men.


The story that emerged from these conversations is about much more than just online advertising. It’s about a market of a quarter of a trillion dollars governed by irrationality. 


The benchmarks that advertising companies use – intended to measure the number of clicks, sales and downloads that occur after an ad is viewed – are fundamentally misleading. None of these benchmarks distinguish between the selection effect (clicks, purchases and downloads that are happening anyway) and the advertising effect (clicks, purchases and downloads that would not have happened without ads).
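
To make the distinction concrete, here is a minimal sketch (with made-up numbers) of why a raw conversion count conflates the two effects, and how a randomized holdout group isolates the advertising effect as incremental lift:

```python
# Hypothetical numbers: 100,000 people saw the ad, 100,000 comparable people
# were deliberately held out and shown nothing.

def conversion_rate(conversions: int, people: int) -> float:
    return conversions / people

exposed_rate = conversion_rate(conversions=900, people=100_000)   # saw the ad
holdout_rate = conversion_rate(conversions=850, people=100_000)   # no ad shown

# Selection effect: purchases that happen anyway (the holdout baseline).
# Advertising effect: only the incremental lift over that baseline.
lift = exposed_rate - holdout_rate

print(f"Naive 'ad-driven' rate: {exposed_rate:.3%}")   # 0.900%
print(f"Baseline without ads:   {holdout_rate:.3%}")   # 0.850%
print(f"True incremental lift:  {lift:.3%}")           # 0.050%
```

Reporting the exposed rate alone credits the ad with every purchase the buyers would have made anyway; only the last number measures what the advertiser actually paid for.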


It gets worse: the brightest minds of this generation are creating algorithms which only increase the effects of selection. 


Advertising platforms are not the only ones susceptible to this flawed way of thinking. Advertisers make the same error. They’re targeting personalised ads at an audience that is already very likely to buy their product.


Marketers are often most successful at marketing their own marketing.

The new dot com bubble is here: it’s called online advertising




This is another awesome site for anyone interested in knowledge management and the original debt we all owe to our inherited legacies of knowledge. All organizations could create visuals of their knowledge networks - to illuminate the necessary interdependencies and the transdisciplinary nature of all knowledge development.

Nature’s reach: narrow work has broad impact

A scientific paper today is inspired by more disciplines than ever before, a new analysis marking the journal’s 150th anniversary shows.
How knowledge informs and alters disciplines is itself an enlightening and vibrant field. This type of meta-research into new findings, insights, conceptual frameworks and techniques is important, among other things, for policymakers who fund research in the hope of tackling society’s most pressing challenges, which inevitably span disciplines.


Since its founding in 1869, Nature has offered a venue for publishing major advances from many fields. To mark its anniversary, we track here how papers cite and are cited across disciplines, using data on tens of millions of scientific articles indexed in Clarivate Analytics’ Web of Science (WoS), a bibliometric database that encompasses many thousands of research journals starting from 1900. We pay particular attention to articles that appeared in Nature. In our view, this snapshot, for all its idiosyncrasies, reveals how scientific work is ever more becoming a mixture of disciplines.


Several caveats are important. The volatility of our metrics in the early twentieth century can be attributed, at least in part, to the fact that articles then typically had many fewer references and citations. Until the mid-1920s, Nature articles typically listed no references; today, they can have up to 50. Another caveat is that the number of disciplines recognized by WoS grew from 57 in 1900 to 251 in 1993, but this is only one factor contributing to the disciplinary trends we found.
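
As a toy illustration of the kind of measure such an analysis relies on, the sketch below (with invented records, not Web of Science data) computes the share of a paper's references that fall outside its own discipline:

```python
# Invented records: each paper has a home discipline and a list of the
# disciplines its references belong to.
from collections import namedtuple

Paper = namedtuple("Paper", ["discipline", "reference_disciplines"])

papers = [
    Paper("Physics", ["Physics", "Materials Science", "Chemistry", "Physics"]),
    Paper("Genetics", ["Genetics", "Computer Science", "Genetics"]),
]

def outside_share(paper: Paper) -> float:
    """Fraction of references that point outside the paper's own discipline."""
    outside = sum(1 for d in paper.reference_disciplines if d != paper.discipline)
    return outside / len(paper.reference_disciplines)

for p in papers:
    print(f"{p.discipline}: {outside_share(p):.0%} of references are cross-disciplinary")
```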


The Spectacular Interactive Graphic of the Web of Research Publication

Standing on the Shoulders of Giants



I think this is a strong signal of an inevitable development. Even if we outlaw or constrain this technology to very special cases - the rich will find ways to engage in genetic design tourism. A very serious question for humanity in the 21st Century: what will it mean to be human, and how will humans evolve to meet the emerging challenges?
The company’s project remains at a preliminary stage. While some embryos have been tested by the company, Tellier, the CEO, says he is unsure if any have yet been used to initiate a pregnancy.

The world’s first Gattaca baby tests are finally here

The DNA test claims to let prospective parents weed out IVF embryos with a high risk of disease or low intelligence.
Anxious couples are approaching fertility doctors in the US with requests for a hotly debated new genetic test being called “23andMe, but on embryos.”


The baby-picking test is being offered by a New Jersey startup company, Genomic Prediction, whose plans we first reported on two years ago. 


The company says it can use DNA measurements to predict which embryos from an IVF procedure are least likely to end up with any of 11 different common diseases. In the next few weeks it's set to release case studies on its first clients. 


Handed report cards on a batch of frozen embryos, parents can use the test results to try to choose the healthiest ones. The grades include risk estimates for diabetes, heart attacks, and five types of cancer.
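
For readers unfamiliar with the mechanism, here is a schematic sketch of how a polygenic score works in general - a weighted sum of risk alleles - using invented variants and weights; it is not Genomic Prediction's actual model:

```python
# Invented variants and effect weights, purely for illustration. A genotype
# maps each variant to the number of risk alleles carried (0, 1 or 2).
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

def polygenic_score(genotype: dict) -> float:
    """Weighted sum of risk alleles across the scored variants."""
    return sum(weight * genotype.get(variant, 0)
               for variant, weight in effect_sizes.items())

embryo_a = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
embryo_b = {"rs0001": 0, "rs0002": 2, "rs0003": 1}

for name, genotype in [("A", embryo_a), ("B", embryo_b)]:
    print(f"Embryo {name}: score = {polygenic_score(genotype):+.2f}")
```

The controversy is less about the arithmetic than about how weak the weights are for traits like intelligence, and about using them to rank embryos at all.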

Not only are we domesticating DNA to enable the design of individuals, species and ecologies - another strong signal points to the inevitable intensification of connection between all parts and all levels of assemblages of parts: the digital environment as a complex living system, with new emergent types of consciousness. Implants could also provide a component of secure identity - like a key that works with a PIN, a password and a personal biometric.
“It was like my body was online,” he said. “It was my very own Johnny Mnemonic moment.”

The rise of microchipping: are we ready for technology to get under the skin?

As implants grow more common, experts fear surveillance and exploitation of workers. Advocates say the concerns are irrational
On 1 August 2017, workers at Three Square Market, a Wisconsin-based company specializing in vending machines, lined up in the office cafeteria to be implanted with microchips. One after the other, they held out a hand to a local tattoo artist who pushed a rice-grain sized implant into the flesh between the thumb and forefinger. The 41 employees who opted into the procedure received complimentary t-shirts that read “I Got Chipped”.


This wholesale implant event, organized by company management, dovetailed with Three Square Market’s longer-term vision of a cashless payment system for their vending machines – workplace snacks purchased with a flick of the wrist. And the televised “chipping party” proved to be a savvy marketing tactic, the story picked up by media outlets from Moscow to Sydney.


But not all of the attention was positive. After the event, comments on Three Square Market’s Facebook page urged employees to quit. The company’s Google reviews page was inundated with one-star ratings. And Christian groups – convinced that the implants fulfilled an end-of-days prophecy where people are branded with “the mark of the beast” – accused the company of being the antichrist.

Another extension of the human sensorium to ever smaller domains.
"In these studies, we've shown that when 4-D-STEM is deployed with our high-speed detectors, customizable algorithms, and powerful electron microscopes, the technique can help scientists map out atomic or molecular regions in any material—even beam-sensitive, soft materials—that weren't possible to see with previous techniques."

World-leading microscopes take candid snapshots of atoms in their 'neighborhoods'

We can directly see the hidden world of atoms thanks to electron microscopes, first developed in the 1930s. Today, electron microscopes, which use beams of electrons to illuminate and magnify a sample, have become even more sophisticated, allowing scientists to take real-world snapshots of materials with a resolution of less than half the diameter of a hydrogen atom.


Now, scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) are pushing the boundaries of electron microscopy even further through a powerful technique called 4-D-STEM, a term that stands for "2-D raster of 2-D diffraction patterns using scanning transmission electron microscopy."


Their findings, reported in Nature Communications and Nature Materials, show for the first time how 4-D-STEM can provide direct insight into the performance of any material—from strong metallic glass to flexible semiconducting films—by pinpointing specific atomic "neighborhoods" that could compromise a material's performance, or perhaps have the potential to improve it.

A combination of materials and computation - a weak signal of computational advances.
In a completely new approach, AMOLF Ph.D. student Andrea Cordaro and his co-workers created a special "metasurface," a transparent substrate with a specially designed array of silicon nanobars. When an image is projected onto the metasurface, the transmitted light forms a new image that shows the edges of the original. Effectively, the metasurface performs a mathematical derivative operation on the image, which provides a direct probe of edges in the image. 

Mathematics at the speed of light

AMOLF researchers and their collaborators from the Advanced Science Research Center (ASRC/CUNY) in New York have created a nanostructured surface capable of performing on-the-fly mathematical operations on an input image. This discovery could boost the speed of existing imaging processing techniques and lower energy usage. The work enables ultrafast object detection and augmented reality applications. The researchers publish their results today in the journal Nano Letters.


Image processing is at the core of several rapidly growing technologies, such as augmented reality, autonomous driving and more general object recognition. But how does a computer find and recognize an object? The initial step is to understand where its boundaries are, hence edge detection in an image becomes the starting point for image recognition. Edge detection is typically performed digitally, using integrated electronic circuits that imply fundamental speed limitations and high energy consumption, or in an analog fashion that requires bulky optics.
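
For intuition, the digital version of this operation is just a spatial derivative of the image; the sketch below (NumPy on a synthetic image) approximates the gradient magnitude that the metasurface computes optically:

```python
# Synthetic test image (a bright square on a dark background); a real pipeline
# would load a photograph instead.
import numpy as np

def edge_map(image: np.ndarray) -> np.ndarray:
    """Approximate the gradient magnitude with first differences."""
    dx = np.diff(image, axis=1, prepend=image[:, :1])  # horizontal derivative
    dy = np.diff(image, axis=0, prepend=image[:1, :])  # vertical derivative
    return np.hypot(dx, dy)                            # large values mark edges

img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0

edges = edge_map(img)
print(edges.max())          # ~1.41 at the corners, 1.0 along straight edges
print((edges > 0).sum())    # nonzero only along the square's boundary
```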

Signalling a future of computation that underscores the inevitable power of open-source approaches - this is a signal to watch.
"In our datacenters, we protect the boot process with secure boot. Our machines boot a known firmware/software stack, cryptographically verify this stack and then gain (or fail to gain) access to resources on our network based on the status of that verification. Titan integrates with this process and offers additional layers of protection."
"The aim of the new coalition is to build trustworthy chip designs for use in data centers, storage and computer peripherals, which are both open and transparent, allowing anyone to inspect the hardware for security vulnerabilities and backdoors."

OpenTitan for data centers: Google, partners push secure silicon design

The Google Security Blog on Tuesday announced OpenTitan, an open source chip design; other organizations have joined Google in an effort to further raise the bar on security surrounding the original Titan chip.


Titan is Google's custom root of trust (RoT) chip. Google in the past has described Titan as "a secure, low-power microcontroller designed with Google hardware security requirements and scenarios in mind."


Titan helps ensure that machines in Google's data centers boot from a known trustworthy state with verified code.
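
The measure-then-verify pattern behind such a root of trust can be sketched in a few lines; the following is a simplified, hypothetical illustration (real RoT chips like Titan verify cryptographically signed firmware with hardware-held keys, not a bare hash comparison):

```python
# Hypothetical "approved" firmware image; in a real system the provisioned
# digest (or a signing key) lives in tamper-resistant storage on the RoT chip.
import hashlib

approved_image = b"\x7fELF...approved-firmware-build"          # stand-in bytes
KNOWN_GOOD_DIGEST = hashlib.sha256(approved_image).hexdigest()

def measure(firmware_image: bytes) -> str:
    """Compute the SHA-256 measurement of a firmware image."""
    return hashlib.sha256(firmware_image).hexdigest()

def verified_boot(firmware_image: bytes) -> bool:
    """Allow boot only if the measurement matches the provisioned digest."""
    return measure(firmware_image) == KNOWN_GOOD_DIGEST

print(verified_boot(approved_image))        # True: continue boot, grant access
print(verified_boot(b"tampered-firmware"))  # False: halt, deny network resources
```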

Marshall McLuhan noted before anyone else that the earth is now within a man-made environment (with the successful launch of Sputnik). 5G is on the horizon, and in the next few decades it is entirely plausible that our environments will be filled with sensors. A great science fiction novel by the Canadian Karl Schroeder, ‘Stealing Worlds’, explores some of the possibilities of a world filled with smart and not-so-smart sensors.
This is a good signal of this sort of inevitable development.
The low cost of the GMpi system (about US$200), made possible by open source software and inexpensive hardware, is also likely to lure researchers looking for a way to ensure their growth chamber experiments are running smoothly. "Many researchers do not have the funding or resources to afford expensive monitoring systems, but still would like to know the conditions their plants experience day-to-day," 

Introducing GMpi: Affordable and adaptable remote monitoring for plant growth experiments

Growth chambers are a cornerstone of laboratory-based plant science, allowing for the tightly controlled conditions necessary for many experimental designs. However, these conditions can sometimes be a little less than controlled, creating headaches ranging from reproducibility issues to the loss of entire experiments. Remote monitoring of conditions helps, but the equipment can be expensive, or lack features or sensors important for a particular experiment. In research presented in a recent issue of Applications in Plant Sciences, Makenzie Mabry, MS, and colleagues at the University of Missouri and University of Arizona developed a flexible and inexpensive monitoring system for plant growth facilities, called Growth Monitor pi, or GMpi. The system uses open source software and a single-board Raspberry Pi computer, and can be connected to a wide variety of different sensors to meet researchers' specific needs.


Necessity is often the mother of invention, and that was the case for GMpi. "We wanted to be able to monitor some sensitive experiments but were traveling a lot at the time and couldn't find anything that completely met our needs with notifications and alerts at an affordable price," said Mabry, the corresponding author of the manuscript. "We hope that the GMpi is approachable for other plant scientists who wish to monitor their plants more closely, have extra security in alerting users to conditions in plant growth facilities, or just wish to increase reproducibility across studies."


The GMpi system's "internet of things" approach maximizes flexibility while keeping costs low, making it an appealing tool to a variety of plant researchers. "We hope that researchers can take the GMpi and expand on it in ways that suit their research," said Mabry. "For example, we think that those interested in phenotyping their plants can adapt the GMpi relatively easily for that purpose, as well as retaining the use of it for monitoring their plants."
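
The underlying monitor-and-alert loop is simple enough to sketch; the following is a generic illustration of the pattern (the sensor read and notification hook are hypothetical stand-ins, not the actual GMpi code, which is described in the paper):

```python
# Poll an environmental sensor on a schedule, log readings to CSV, and notify
# the researcher when conditions drift out of range.
import csv
import time
from datetime import datetime

TEMP_RANGE_C = (20.0, 26.0)   # acceptable growth-chamber temperatures (assumed)
POLL_SECONDS = 300            # take a reading every five minutes

def read_temperature() -> float:
    """Placeholder for a real sensor driver (e.g. a read over GPIO/I2C)."""
    return 22.5

def alert(message: str) -> None:
    """Placeholder for an email or SMS notification."""
    print(f"ALERT: {message}")

def monitor(log_path: str = "chamber_log.csv") -> None:
    with open(log_path, "a", newline="") as log:
        writer = csv.writer(log)
        while True:
            temp = read_temperature()
            writer.writerow([datetime.now().isoformat(), f"{temp:.2f}"])
            log.flush()
            if not TEMP_RANGE_C[0] <= temp <= TEMP_RANGE_C[1]:
                alert(f"Temperature {temp:.1f} C outside range {TEMP_RANGE_C}")
            time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    monitor()
```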

A weak but important signal of one possible future trajectory of Domesticating DNA.
"It's the most complicated genetic, cellular engineering that's been attempted so far," said the study leader, Dr. Edward Stadtmauer of the University of Pennsylvania in Philadelphia. "This is proof that we can safely do gene editing of these cells."

Doctors try CRISPR gene editing for cancer, a 1st in the US

The first attempt in the United States to use a gene editing tool called CRISPR against cancer seems safe in the three patients who have had it so far, but it's too soon to know if it will improve survival, doctors reported Wednesday.


The doctors were able to take immune system cells from the patients' blood and alter them genetically to help them recognize and fight cancer, with minimal and manageable side effects.


The treatment deletes three genes that might have been hindering these cells' ability to attack the disease, and adds a new, fourth feature to help them do the job.

This is a great signal for why we must continue our domestication of DNA.
The discovery shows that other crop-destroying rust strains could hybridise in other parts of the world, and scientists found evidence of this in their study.
It also means Ug99 could once again exchange genetic material with different pathogen strains to create a whole new enemy.
"There is some good news, however, as the more you know your enemy, the more equipped you are to fight against it.
"Knowing how these pathogens come about means we can better predict how they are likely to change in the future and better determine which resistance genes can be bred into wheat varieties to give long-lasting protection."

Cereal killer's deadly touch could lead to new wheat threat

Scientists have uncovered the origins of the world's deadliest strain of cereal rust disease which threatens global food security.


Researchers from Australia's national science agency, CSIRO, together with partners in the US and South Africa have solved a 20-year-old mystery with findings published today in Nature Communications.


Their work shows that the devastating Ug99 strain of the wheat stem rust fungus (named for its discovery in Uganda in 1999) was created when different rust strains simply fused to create a new hybrid strain.


This process is called somatic hybridisation and enables the fungi to merge their cells together and exchange genetic material without going through the complex sexual reproduction cycle.

And another weak but important signal

A game-changing test for Prion, Alzheimer's and Parkinson's diseases is on the horizon

There are currently no effective treatments for prion diseases, a family of fatal neurodegenerative conditions caused by accumulations of misfolded copies of a naturally occurring protein. But now, there is finally an effective way to test for them.


As reported in the journal PLOS ONE, a team of scientists who have been working on prion detection for nearly 20 years have demonstrated that their unique, synthetic-molecule-based approach can isolate prion proteins in body fluids sampled from infected animals. This finding—which confirms that their test is the only published testing method capable of quick, noninvasive prion detection in living subjects—is a momentous milestone in the evolution of a biomedical technology with far-ranging applications.


"Our peptoid beads have the ability to detect the misfolded proteins that act as infectious agents, so it could have a significant impact in the realm of prion diseases, but we have also shown that it can seek out the large aggregated proteins that are the disease agents in Alzheimer's and Parkinson's diseases, among others" said Ronald Zuckermann, an early pioneer of peptoids and one of the research team's founding members. Zuckermann is now a senior scientist at Lawrence Berkeley National Laboratory (Berkeley Lab)'s Molecular Foundry."Prion diseases are rare, but there are many misfolded protein-based diseases, which affect millions of people, that are also very poorly understood. And like prion diseases, we need a way to diagnose these slow-onset conditions in the years before symptoms arise."

A wonderful signal of the future of energy and the transformation of energy geopolitics.
The most advanced potential commercial use the team developed is a transparent coating that can be applied to home windows, a moving vehicle, or even clothing. The coating collects solar energy and releases heat, reducing electricity required for heating spaces and curbing carbon emissions. Moth-Poulsen is coating an entire building on campus to showcase the technology. The ideal use in the early going, he says, is in relatively small spaces. “This could be heating of electrical vehicles or in houses.”

An Energy Breakthrough Could Store Solar Power for Decades

 Researchers in Sweden have created a molecule that offers a way to trap heat from the sun.
For decades, scientists have sought an affordable and effective way of capturing, storing, and releasing solar energy. Researchers in Sweden say they have a solution that would allow the power of the sun’s rays to be used across a range of consumer applications—heating everything from homes to vehicles.


Scientists at Chalmers University of Technology in Gothenburg have figured out how to harness the energy and keep it in reserve so it can be released on demand in the form of heat—even decades after it was captured. The innovations include an energy-trapping molecule, a storage system that promises to outperform traditional batteries, at least when it comes to heating, and an energy-storing laminate coating that can be applied to windows and textiles. 


The breakthroughs, from a team led by researcher Kasper Moth-Poulsen, have garnered praise within the scientific community. Now comes the real test: whether Moth-Poulsen can get investors to back his technology and take it to market.


The system starts with a liquid molecule made up of carbon, hydrogen, and nitrogen. When hit by sunlight, the molecule draws in the sun’s energy and holds it until a catalyst triggers its release as heat. The researchers spent almost a decade and $2.5 million to create a specialized storage unit, which Moth-Poulsen, a 40-year-old professor in the department of chemistry and chemical engineering, says has the stability to outlast the 5-to 10-year life span of typical lithium-ion batteries on the market today.

A couple of nice signals in this article. One is that there are many inexpensive, reliable and efficient ways to make a battery; the other is inspired by the visual in the article: parking lots can make great solar farms - if the solar panels are mounted high enough for cars to park underneath - with the added benefit that the cars are less exposed to wear from heating in the sun.
“Air conditioning accounts for 40 percent of our daily energy usage, so by eliminating this we are taking a major step towards our carbon neutral goal.”
“The system was switched on in September and is now delivering 2.1 megawatts of power and we estimate that we will save more than AU$100 million (US$69 million) in energy costs over the next 25 years.”
The system is expected to prevent more than 92,000 tonnes of CO2 emissions over the coming 25 years, the equivalent output of 525 typical Australian homes in the same timeframe, according to USC.

Three-story water battery cuts university's energy usage by 40 percent

The University of the Sunshine Coast (USC) in Queensland, Australia, is on a mission to become completely carbon neutral by 2025, and a huge early addition to its energy systems is boding well for these lofty ambitions. Switched on in September, a new three-story “water battery” is already producing enough juice to power the campus’ air conditioning systems, reducing its reliance on the grid by more than 40 percent.


In pursuit of its climate-neutral goals, USC teamed up with private company Veolia to draw up a new clean energy solution for its buildings. Looking to make the most of the region’s abundant sunshine and take a bite out of the grid energy used for air conditioning, which accounts for 40 percent of its overall usage, the two came up with a solution they’ve dubbed the “water battery.”


It is in essence a huge thermal energy storage system. It makes use of 6,000 solar panels installed on the campus’ rooftops and carparks that make up a 2.1-megawatt photovoltaic system. The energy generated by this solar system is then used to cool 4.5 megaliters of water resting inside a three-story tank. This cooled water is then used for the campus’ air conditioning systems, and to great effect.
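
A back-of-envelope calculation shows why megalitres of chilled water amount to a useful "battery"; the temperature swing below is an assumption, since the article does not give one:

```python
# 4.5 megalitres of water weighs roughly 4.5 million kilograms; the usable
# temperature swing is assumed for illustration.
WATER_MASS_KG = 4.5e6
SPECIFIC_HEAT_J_PER_KG_K = 4186
DELTA_T_K = 6.0                       # assumed usable swing

energy_joules = WATER_MASS_KG * SPECIFIC_HEAT_J_PER_KG_K * DELTA_T_K
energy_mwh = energy_joules / 3.6e9    # 1 MWh = 3.6e9 J
print(f"~{energy_mwh:.0f} MWh of cooling stored per full charge")   # ~31 MWh
```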

I am an unabashed coffee aficionado (a great coffee lover) and therefore can’t resist including this signal.

Researchers discover coffee drinkers could halve their risk of liver cancer

A research team from Queen's University has found that coffee drinkers have a lower risk of the most common type of liver cancer, hepatocellular carcinoma (HCC).
The results were presented at the National Cancer Research Institute (NCRI) conference in Glasgow this week and were published in the British Journal of Cancer earlier this year.


Coffee is one of the most commonly consumed beverages worldwide. Previous research has shown there are many health benefits of drinking coffee, which may be due to its high levels of antioxidants.


The study took place in the UK over 7.5 years and looked at the coffee-drinking habits of 471,779 participants in the UK Biobank, one of the largest studies of middle-aged individuals in the world.


The research team's overall findings suggested a reduced risk of hepatocellular carcinoma, the most common form of liver cancer, in coffee drinkers compared to those who did not drink coffee.
