Friends are a treasure. In an uncertain world, they provide a comforting sense of stability and connection. We laugh together and cry together, sharing our good times and supporting each other through the bad. Yet a defining feature of friendship is that it’s voluntary. We’re not wedded together by law, or through blood, or via monthly payments into our bank accounts. It is a relationship of great freedom, one that we retain only because we want to.
But the downside of all this freedom, this lack of formal commitment, is that friendship often falls by the wayside. Our adult lives can become a monsoon of obligations, from children, to partners, to ailing parents, to work hours that trespass on our free time. A study of young adults’ social networks by researchers at the University of Oxford found that those in a romantic relationship had, on average, two fewer close social ties, including friends. Those with kids had lost out even more. Friendships crumble, not because of any deliberate decision to let them go, but because we have other priorities, ones that aren’t quite as voluntary. The title of the Oxford paper summed things up well: ‘Romance and Reproduction Are Socially Costly’.
As law professor Tim Wu noted in his book The Master Switch, new media tend to start out in a Wild West, then clean up, put on a suit, and consolidate in a cautious center. Radio, for example, began as a chaos of small operators proud to say anything, then gradually coagulated into a small number of mammoth networks aimed mostly at pleasing the mainstream.
This seems to me to identify the major internal weakness of the “normcore” approach to analysis that Levitsky and Ziblatt have become associated with. This approach tends to treat norms as worth respecting in and of themselves, on the argument that such norms are what prevent politics from breaking down entirely. This is not an obviously wrong argument, especially in a polity like the U.S., where a two-centuries-old constitution has been jury-rigged by norms into something that might, just about, manage a modern polity without sinking.
But the problem is that norms are institutions (more precisely, they are informal institutions that are not supported by formal external punishments but by the expectations of the actors that adhere to them) and institutions do not exist in a vacuum. In game theoretic terms, norm maintenance depends on actors’ expectations about “what is off the equilibrium path.” In more practical language, norm maintenance requires not just that political actors worry about the chaos that will ensue if the norms stop working. It also relies on the fear of punishment – that if one side deviates from the political bargain implicit in the norm, the other side will retaliate, likely by breaking the norm in future situations in ways that are to their own particular advantage.
What this means, pretty straightforwardly, is that norms don’t just rely on the willingness of the relevant actors to adhere to them. They also rely on the willingness of actors to violate them under the right circumstances. If one side violates, then the other side has to be prepared to punish. If one side threatens a violation, then the other side has to threaten in turn, to make it clear that deviating from the norm will be costly. A norm governing relations between two opposing sides, where one side acts strategically (to exploit opportunities) and the other naively (always to support the norm), can’t be sustained.
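The norm-maintenance argument can be sketched as a toy iterated game (my illustration, not from the text, using standard prisoner's-dilemma payoffs): a naive side that always upholds the norm gets exploited indefinitely, while a side prepared to punish violations makes defection costly after one round.

```python
# Toy model: "C" upholds the norm, "D" violates it.
# Payoffs are the conventional prisoner's-dilemma values (assumed).
PAYOFF = {  # (my_move, their_move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=20):
    """Run an iterated game; each strategy sees the opponent's history."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strategy_a(hist_b), strategy_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

naive = lambda opp_hist: "C"                              # always supports the norm
defector = lambda opp_hist: "D"                           # always exploits
grim = lambda opp_hist: "D" if "D" in opp_hist else "C"   # punishes any violation

print(play(defector, naive))   # (100, 0): the exploiter thrives unpunished
print(play(defector, grim))    # (24, 19): punishment makes violation costly
```

Against the naive player, violating the norm pays every round; against the punisher, the one-time gain from violation is nearly wiped out, which is exactly the deterrence the passage describes.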
You don’t have to analyze individual water molecules to understand the behavior of droplets, or droplets to study a wave. This ability to shift focus across various scales is the essence of renormalization.
As cognitive science increasingly reveals, our thinking doesn’t run on a single track, like a serial computer, but seems to be organised into a variety of faculties, or modes of thought, that loosely communicate with each other. The jagged nature of the interaction might be responsible for the sense of fissure within the mind, reported by many writers and thinkers. Language is just one mode of thought, with its own characteristic parameters and limitations. Though it uniquely affords us a distanced perspective on our thoughts, it is only an imperfect instrument for capturing them. There are other modes that can present us with aspects of reality and interface more directly with our emotions but are less amenable to explicit reasoning and articulation. Only an uncooperative (and mean-spirited) interlocutor would regard our difficulties in articulation as a sign that we lack anything meaningful to say.
This is another signal confirming that 60 is the new 50.
“This research is unique because there are only a few studies in the world that have compared performance-based maximum measures between people of the same age in different historical times,”
“The results suggest that our understanding of older age is old-fashioned. From an aging researcher’s point of view, more years are added to midlife, and not so much to the utmost end of life. Increased life expectancy provides us with more non-disabled years, but at the same time, the last years of life come at higher and higher ages, increasing the need for care. Among the ageing population, two simultaneous changes are happening: continuation of healthy years to higher ages and an increased number of very old people who need external care.”
The functional ability of older people is nowadays better than that of people at the same age three decades ago. This was observed in a study conducted at the Faculty of Sport and Health Sciences at the University of Jyväskylä, Finland. The study compared the physical and cognitive performance of people currently between the ages of 75 and 80 with that of same-aged people in the 1990s.
Among men and women between the ages of 75 and 80, muscle strength, walking speed, reaction speed, verbal fluency, reasoning and working memory are nowadays significantly better than they were among people of the same age born earlier. In lung function tests, however, differences between cohorts were not observed.
“Higher physical activity and increased body size explained the better walking speed and muscle strength among the later-born cohort,” says doctoral student Kaisa Koivunen, “whereas the most important underlying factor behind the cohort differences in cognitive performance was longer education.”
Here’s a very interesting signal - about the longer-term impact of playing video games - I wonder what they could do for the aging boomers.
"People who were avid gamers before adolescence, despite no longer playing, performed better with the working memory tasks, which require mentally holding and manipulating information to get a result,"
A number of studies have shown how playing video games can lead to structural changes in the brain, including increasing the size of some regions, or to functional changes, such as activating the areas responsible for attention or visual-spatial skills. New research from the Universitat Oberta de Catalunya (UOC) has gone further to show how cognitive changes can take place even years after people stop playing.
This is one of the conclusions from the article published in Frontiers in Human Neuroscience. The study involved 27 people between the ages of 18 and 40, with and without prior experience of video gaming.
The results show that people without experience of playing video games as a child did not benefit from improvements in processing and inhibiting irrelevant stimuli. Indeed, they were slower than those who had played games as children, which matched what had been seen in earlier studies.
Likewise, "people who played regularly as children performed better from the outset in processing 3-D objects, although these differences were mitigated after the period of training in video gaming, when both groups showed similar levels," said Palaus.
I think this is lovely - incorporating a ‘belief function’ into a mathematical approach to uncertainty - anyone feeling twinges of cognitive dissonance?
How should people make decisions when the outcomes of their choices are uncertain, and the uncertainty cannot be captured by precise probabilities?
That's the question faced by Prakash Shenoy, the Ronald G. Harper Distinguished Professor of Artificial Intelligence at the University of Kansas School of Business.
His answer can be found in the article "An Interval-Valued Utility Theory for Decision Making with Dempster-Shafer Belief Functions," which appears in the September issue of the International Journal of Approximate Reasoning.
"People assume that you can always attach probabilities to uncertain events," Shenoy said.
"But in real life, you never know what the probabilities are. You don't know if it's 50 percent or 60 percent. This is the essence of the theory of belief functions that Arthur Dempster and Glenn Shafer formulated in the 1970s."
His article (co-written with Thierry Denoeux) generalizes the theory of decision-making from probability to belief functions. "Probability decision theory is used for making any sort of high-stakes choice. Like should I accept a new job or a marriage proposal? Something high stakes. You wouldn't need it for where to go for lunch," he said.
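The belief/plausibility interval at the heart of Dempster-Shafer theory can be sketched in a few lines (a toy illustration of the basic machinery, not the paper's interval-valued utility theory; the weather outcomes and mass values are invented):

```python
# Dempster-Shafer sketch: a mass function assigns probability mass to
# *sets* of outcomes, so ignorance can be represented directly instead
# of being forced into a single precise probability.

frame = {"rain", "sun"}  # frame of discernment (hypothetical example)

# 0.5 committed to "rain", 0.2 to "sun", and 0.3 left uncommitted
# (assigned to the whole frame) to express partial ignorance.
mass = {
    frozenset({"rain"}): 0.5,
    frozenset({"sun"}): 0.2,
    frozenset(frame): 0.3,
}

def belief(event):
    """Total mass of focal sets wholly contained in the event."""
    e = frozenset(event)
    return sum(m for s, m in mass.items() if s <= e)

def plausibility(event):
    """Total mass of focal sets that overlap the event at all."""
    e = frozenset(event)
    return sum(m for s, m in mass.items() if s & e)

# Instead of "the probability of rain is p", we get an interval:
print(belief({"rain"}), plausibility({"rain"}))  # 0.5 0.8
```

The gap between belief (0.5) and plausibility (0.8) is exactly the "you don't know if it's 50 percent or 60 percent" situation Shenoy describes: the uncommitted mass widens a point probability into an interval.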
This may seem a bit arcane - but it is really about grasping the exponential increase in computational capability available at the same cost. Compare what a given sum bought in a computing device in 1960 with what the same cost can deliver in 2020. And with a changing paradigm, the material manifestation of a transistor could be a single molecule by 2060 - or sooner. What will devices be then?
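The scale of that cost-capability shift is easy to underestimate. A back-of-envelope calculation (my numbers, assuming the common Moore's-law approximation of compute per dollar doubling roughly every two years) gives a sense of it:

```python
# Rough illustration of exponential growth in compute per dollar.
# The 2-year doubling period is an assumed approximation, not a
# figure from the article.
years = 2020 - 1960
doubling_period = 2  # years, assumed
factor = 2 ** (years / doubling_period)

print(f"~{factor:.0e}x more computation per dollar since 1960")
```

Thirty doublings is roughly a billion-fold improvement - which is why a 1960 mainframe budget now buys a device that fits in a pocket.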
We rarely think about the technology that lies behind turning on a light bulb or our use of electrical appliances. The control of charged particles on a minute scale is simply part of everyday life.
But on a much smaller nanoscale, scientists are now routinely able to manipulate the flow of electrons. This opens up possibilities for even smaller components in computers and mobile phones that use barely any electricity.
Researchers at the Norwegian University of Science and Technology (NTNU) have found a completely new method to check the electronic properties of oxide materials. This opens the door to even tinier components and perhaps more sustainable electronics.
"We found a completely new way to control the conductivity of materials at the nanoscale," says Professor Dennis Meier at NTNU's Department of Materials Science and Engineering.
One of the best aspects of the new method is that it does not interfere with other properties of the material, as previous methods did. This makes it possible to combine different functions in the same material, which is an important advance for nanoscale technology.
A new article in the journal Nature Materials reports the findings. The article attracted international attention even before being printed.
In my hippie youth I owned a Chevy van - it was an old one - but I was handy enough with it that once, during midwinter in Ottawa, I had to stop on the side of the street, lift the engine hood, take off the carburetor, take it apart enough to clean it, re-install it, start the engine and continue - I wonder if the new electric engines promise that level of user-interaction?
The 30-second video is worth a view.
General Motors on Wednesday announced plans for the production of a family of electric motors and drive units for its next generation of electric cars and trucks.
It will design and manufacture five interchangeable electric powertrains and three electric motors under the name Ultium Drive. The electric drive systems will be used across a spectrum of vehicles, from passenger cars to pickup trucks to high performance autos.
As GM transitions to a complete electric lineup, its vehicles will have better integration between the motor, the electrical system and the car's other components, and will achieve greater efficiencies with Ultium Drive.
The new electric drive systems, also referred to as e-drives, combine gear, motor and power electronics into a single system that will more efficiently convert energy to drive the vehicle. By building the power electronics into the drive assemblies, greater power is attained in roughly half the space. And the system is lighter.
An interesting signal of the emergence of the domestication of bacteria - for manufacturing.
Rare earth elements are vital for many modern technologies. Chemists at LMU have now shown that a cofactor found in a bacterial enzyme can selectively extract some of these metals from mixtures in an environmentally benign fashion.
Rare earth elements (REEs) are an indispensable ingredient of the electronic devices that are now an integral part of our daily lives. They are employed in computers, smartphones, electric motors and many other key technologies as components of magnets and batteries, and also serve as powerful chemical catalysts. REEs comprise 17 elements—scandium, yttrium, lanthanum and the 14 lanthanides that follow lanthanum in the Periodic Table. In nature, they occur as mixtures and are often found in association with the radioactive elements uranium and thorium. All REEs exhibit very similar chemical properties, which makes separating them from each other a difficult, energy-intensive and environmentally problematic task. Now a team led by LMU chemist Professor Lena Daumann has shown that an enzyme cofactor called pyrroloquinoline quinone (PQQ) found in certain species of bacteria selectively binds to specific REEs and can be used to separate them from mixtures.
That REEs also play essential roles in the biosphere was discovered less than 10 years ago, when it was shown that certain types of bacteria can selectively take up lanthanides from the environment, which are then incorporated into enzymes for use as metabolic catalysts.
This is definitely a strong signal - time to listen carefully.
Coal is now a loser around the world.
In the US and across the world, coal power is dying. By 2030, it will be uneconomic to run existing coal plants. That means the dozens of coal plants on the drawing board today are doomed to become stranded assets.
1. It is already cheaper to build new renewables than to build new coal plants, in all major markets.
2. Over half the existing global coal fleet is more expensive to run than building new renewables.
3. By 2030, it will be cheaper to build new renewables than to run existing coal — everywhere.
4. Investors stand to lose over $600 billion on doomed coal plants.
This is another small signal in the transformation of energy and transportation.
European planemaker Airbus SE unveiled three designs it’s studying to build hydrogen-powered aircraft as it races to bring a zero-carbon passenger plane into service by 2035.
The approaches include a turbofan jet with capacity for as many as 200 passengers -- similar to its A321neo narrow-body -- that can fly more than 2,000 nautical miles, according to a statement Monday. It would be powered by a modified gas-turbine engine running on hydrogen.
The manufacturer also showed a design for a propeller plane which would seat about 100 passengers for smaller distances, and a flying-wing concept with 200 seats.
The company is under pressure from the French and German governments, its biggest shareholders, to speed development of new aircraft after aiding the planemaker during the coronavirus crisis. Together, the two countries have committed some 2.5 billion euros ($2.9 billion) toward cleaner propulsion.
Another small signal of the progress to make drinkable water more widely available cheaply.
Membrane separations have become critical to human existence, with no better example than water purification. As water scarcity becomes more common and communities start running out of cheap available water, they need to supplement their supplies with desalinated water from seawater and brackish water sources.
Lawrence Livermore National Laboratory (LLNL) researchers have created carbon nanotube (CNT) pores that are so efficient at removing salt from water that they are comparable to commercial desalination membranes. These tiny pores are just 0.8 nanometers (nm) in diameter. In comparison, a human hair is 60,000 nm across. The research appears on the cover of the Sept. 18 issue of the journal Science Advances.
Biological water channels, also known as aquaporins, provide a blueprint for the structures that could offer increased performance. They have an extremely narrow inner pore that squeezes water down to a single-file configuration that enables extremely high water permeability, with transport rates exceeding 1 billion water molecules per second through each pore.
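It is worth translating "1 billion water molecules per second through each pore" into bulk terms. A quick back-of-envelope calculation (my arithmetic, not from the article; the pore density is a hypothetical figure chosen only to give a sense of scale):

```python
# Convert the per-pore transport rate into volumes of water.
AVOGADRO = 6.022e23       # molecules per mole
MOLAR_MASS_WATER = 18.0   # g/mol
DENSITY_WATER = 1.0       # g/mL

molecules_per_s = 1e9     # per pore, from the article
ml_per_molecule = MOLAR_MASS_WATER / (AVOGADRO * DENSITY_WATER)

seconds_per_day = 86400
ml_per_pore_per_day = molecules_per_s * ml_per_molecule * seconds_per_day

# Assumed pore density of 1e12 pores per cm^2 (hypothetical, for scale only).
pore_density = 1e12
litres_per_cm2_per_day = ml_per_pore_per_day * pore_density / 1000

print(f"{ml_per_pore_per_day:.2e} mL per pore per day")
print(f"~{litres_per_cm2_per_day:.1f} L per cm^2 of membrane per day")
```

Each pore moves only a few billionths of a millilitre a day, but because the pores are sub-nanometre, enormous numbers of them can be packed into a small membrane area, which is where the practical throughput comes from.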
So just because I love coffee - to the point I roast my own beans.
The investigators found that in 1,171 patients treated for metastatic colorectal cancer, those who reported drinking two to three cups of coffee a day were likely to live longer overall, and had a longer time before their disease worsened, than those who didn't drink coffee. Participants who drank larger amounts of coffee—more than four cups a day—had an even greater benefit in these measures. The benefits held for both caffeinated and decaffeinated coffee.
The findings enabled investigators to establish an association, but not a cause-and-effect relationship, between coffee drinking and reduced risk of cancer progression and death among study participants. As a result, the study doesn't provide sufficient grounds for recommending, at this point, that people with advanced or metastatic colorectal cancer start drinking coffee on a daily basis or increase their consumption of the drink, researchers say.