Less than 1% of the world’s population is involved in software development. This is unfortunate, because the ideas software developers use to manage complexity apply directly to the problems of human governance, yet most of us remain ignorant of them.
This lack of understanding of software is also pervasive in other scientific fields. Most science is performed using concepts that existed before the invention of the computer. Many are unaware that our immersion in computers generates entirely new universal ideas.
Humanity faces many difficult, complex governance problems (e.g. climate change, pandemics), yet most people working in these fields are unaware of the concepts and tools software developers have invented to tackle complexity.
Human civilization depends critically on our ability to express complex ideas. Unfortunately, too many of us have never learned these newer vocabularies, and when we are exposed to them, we misinterpret what the experts are saying.
The experience of India shows how consequential these agreements can be. In 1972 the nation banned product patents in pharmaceuticals. At the time, medicine prices in the country were among the highest in the world, but critics of the ban warned that the country would lose access to imported medicines. In the decades that followed, however, India established a vast indigenous generics manufacturing industry and reverse engineered most state-of-the-art medicines developed elsewhere. Prices in the country dropped to among the lowest in the world, and by the turn of the century, Indian generic companies had become the largest supplier of affordable essential medicines outside the western world and the largest global supplier of generic medicines. Doctors Without Borders dubbed the country the “pharmacy of the developing world.”
The success of this industry was not predictable from standard narratives of export-oriented growth. This was not a case of low-wage-led industrialization; India did not have a comparative advantage in the labor, knowhow or raw materials required for drug production. Instead, a combination of industrial policy (including early public investment), learning by doing as Indian pharmaceutical companies gained technical and technological expertise, a fortuitously large pool of scientists, and, critically, the absence of IP restrictions on the adoption of foreign technology allowed the country to become a low-cost producer. In theory, countries specialize in the things they are best at making. In reality, what countries are good at depends on what they make, or are allowed to make.
Three important, overlapping arguments from across Ostrom’s scholarship form a case for decentralisation and enhanced community power:
The commons: Communities can manage their own resources.
Beyond markets and states, there is a third model where communities establish their own systems without the need for regulation or privatisation. These communities can be found all over the world and are demonstrably capable of managing common resources and assets in a more sustainable and productive way than comparable state or market systems.
Self-governance: Democracy is more meaningful at a local level.
Legitimacy and social trust can only flourish when people have a reasonable expectation of influence over the things that affect their lives. Mobilised communities will tend to benefit from having decision making power and control over resources to develop local services and facilities.
Polycentricity: In complex social and environmental systems there are no one-size-fits-all solutions.
What is needed is a dynamic system that permits experimentation, and which can tolerate the existence of diverse and layered institutions of different kinds. The alternative – where top-down, monolithic systems dominate – diminishes resilience: it centralises risks and quashes creative, adaptive solutions to problems.
Three Core Conditions of Community Power
Locality: Systems should be designed for specific places.
Systems – including the way that resources are managed, rules are designed, and decisions are made – should be originated within, and appropriate for, the particular places where they operate. Ostrom’s evidence shows this makes it more likely that people will collaborate and cooperate with each other, and that overall outcomes can be improved this way.
Autonomy: The rights of communities to create and run local systems must be respected. Communities will have few incentives to come together without a basic expectation that their decisions and participation will have meaning and impact, and that their decisions will be respected by external parties.
Diversity: Each community is different – and will take different approaches. Context-driven, autonomous communities will experiment with different systems. Taking different approaches in different places means people have a range of opportunities to get involved, enriching civil society. This diversity should be promoted, as it may reveal strong new approaches.
This is an excellent five-minute introduction to Modern Monetary Theory, the economic paradigm we need if we are going to meet the challenges of climate change and aging infrastructure, and to build the modern physical and social infrastructure that enables all people to flourish.
A MUST view.
Modern monetary theory, or MMT for short, is a superior framework for understanding how our monetary system functions today. It has been developed since the 1990s by Professor Bill Mitchell, alongside American academics like Professor Randall Wray, Professor Stephanie Kelton, and investment banker and fund manager Warren Mosler. MMT builds on the ideas of a previous generation of economists, such as Hyman Minsky, Wynne Godley and Abba Lerner.
Thanks to Dr. Steven Hail, Prof. Bill Mitchell, Warren Mosler, Patricia Pino & Christian Reily (at The MMT Podcast) for providing the inspiration and feedback for this video.
This is a positive signal of the future, one where we can enable a more participatory democracy to care for our social, economic, political, and ecological commons.
This report draws out Ostrom’s insights for the UK in the context of a growing crisis in the relationship between people and institutions. It adapts and contextualises her work into a new set of practical lessons for ‘self-governance’ – where communities take control over the things that matter to them – and connects these with contemporary examples of community-powered projects in the UK.
Elinor Ostrom humanised the study of economics and politics. She discovered what is possible, and the problems that can be solved, when we trust each other. Her work inspires optimism, but she was also a realist, basing her findings on decades of tireless work in the real world. This quietly revolutionary research led her to become the first woman to win a Nobel prize in economics. She demonstrated that people’s motivation and ability to cooperate, participate, and sustainably control their own resources are far greater than is usually assumed.
Ostrom’s work offers grounds for ambitiously re-imagining the relationship between people and institutions. It should inform and inspire policy debate about community power, devolution, public service reform, and organisational transformation.
A sign of the current state of privacy.
When Apple releases these “client-side scanning” functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and may be unable to safely use what was, until this development, one of the preeminent encrypted messengers.
Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.
Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.
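To make the first feature concrete, here is a minimal sketch of the matching idea, with loud caveats: it uses SHA-256 as a stand-in fingerprint (Apple's actual system uses a perceptual hash called NeuralHash, wrapped in cryptographic protocols), and the function and database names here are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a perceptual hash. Apple's NeuralHash is
# designed to survive small image edits; SHA-256 is not, so this is
# purely an illustration of matching against a known-bad list.
def image_fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-CSAM fingerprints (an NCMEC-style list).
# This entry is the SHA-256 of the empty byte string, used for the demo.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-bad list."""
    return image_fingerprint(image_bytes) in KNOWN_BAD_HASHES

print(scan_before_upload(b""))   # True: matches the demo entry above
print(scan_before_upload(b"x"))  # False
```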
This is a good signal about viruses, a subject the whole globe is now acutely aware of and concerned about.
If you think you don’t have viruses, think again.
It may be hard to fathom, but the human body is occupied by large collections of microorganisms, commonly referred to as our microbiome, that have evolved with us since the early days of man. Scientists have only recently begun to quantify the microbiome, and discovered it is inhabited by at least 38 trillion bacteria. More intriguing, perhaps, is that bacteria are not the most abundant microbes that live in and on our bodies. That award goes to viruses.
It has been estimated that there are over 380 trillion viruses inhabiting us, a community collectively known as the human virome. But these viruses are not the dangerous ones you commonly hear about, like those that cause the flu or the common cold, or more sinister infections like Ebola or dengue. Many of these viruses infect the bacteria that live inside you and are known as bacteriophages, or phages for short. The human body is a breeding ground for phages, and despite their abundance, we have very little insight into what they, or any of the other viruses in the body, are doing.
And a nice account of some recent progress on our bacterial ecologies - this is worth the read.
Scientists are starting to work out how the gut microbiome can affect brain health. That might lead to better and easier treatments for brain diseases.
An important signal, especially in the context of the massive forest fires of the last decade. It also signals the complex relationships that have to be understood in any natural phenomenon.
"The main thing is that nobody has known whether planting trees at midlatitudes is good or bad because of the albedo problem," "We show that if one considers that clouds tend to form more frequently over forested areas, then planting trees over large areas is advantageous and should be done for climate purposes."
Planting trees and replenishing forests are among the simplest and most appealing natural climate solutions, but the impact of trees on atmospheric temperature is more complex than meets the eye.
One question among scientists is whether reforesting midlatitude locations such as North America or Europe could in fact make the planet hotter. Forests absorb large amounts of solar radiation as a result of having a low albedo, which is the measure of a surface's ability to reflect sunlight. In the tropics, low albedo is offset by the higher uptake of carbon dioxide by the dense, year-round vegetation. But in temperate climates, the concern is that the sun's trapped heat could counteract any cooling effect forests would provide by removing carbon dioxide from the atmosphere.
But a new study from Princeton University researchers found that these concerns may be overlooking a crucial component—clouds. They report in the Proceedings of the National Academy of Sciences that the denser cloud formations associated with forested areas mean that reforestation would likely be more effective at cooling Earth's atmosphere than previously thought.
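To see why albedo matters, here is a toy shortwave energy-balance comparison. All numbers are illustrative, textbook-style values chosen for the sketch, not figures from the Princeton study.

```python
# Toy comparison of absorbed sunlight: absorbed = flux * (1 - albedo).
SOLAR_IN = 240.0         # W/m^2, rough mean incoming shortwave (assumed)
ALBEDO_FOREST = 0.12     # forests are dark: low albedo, high absorption
ALBEDO_GRASSLAND = 0.22  # lighter land cover reflects more sunlight
CLOUD_REFLECTION = 0.10  # assumed extra reflection from clouds over forest

def absorbed(flux, albedo):
    """Shortwave energy absorbed by a surface."""
    return flux * (1.0 - albedo)

forest = absorbed(SOLAR_IN, ALBEDO_FOREST)
grassland = absorbed(SOLAR_IN, ALBEDO_GRASSLAND)
# Clouds reflect part of the sunlight before it reaches the canopy.
forest_cloudy = absorbed(SOLAR_IN * (1.0 - CLOUD_REFLECTION), ALBEDO_FOREST)

print(f"forest:              {forest:.1f} W/m^2 absorbed")        # 211.2
print(f"grassland:           {grassland:.1f} W/m^2 absorbed")     # 187.2
print(f"forest under clouds: {forest_cloudy:.1f} W/m^2 absorbed") # 190.1
```

Under these assumed numbers, extra cloud cover erases most of the forest's albedo penalty, which is the qualitative point of the study.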
This is a sort of Gödel-type signal of some fundamental limits to knowability in aspects of mathematical and logical reasoning.
“There is a kind of worst-case hardness to it that is worth knowing about,” said Paul Goldberg of the University of Oxford, co-author of the work along with John Fearnley and Rahul Savani of the University of Liverpool and Alexandros Hollender of Oxford. The result received a Best Paper Award in June at the annual Symposium on Theory of Computing.
The most widely used technique for finding the largest or smallest values of a math function turns out to be a fundamentally difficult computational problem.
Many aspects of modern applied research rely on a crucial algorithm called gradient descent. This is a procedure generally used for finding the largest or smallest values of a particular mathematical function — a process known as optimizing the function. It can be used to calculate anything from the most profitable way to manufacture a product to the best way to assign shifts to workers.
Yet despite this widespread usefulness, researchers have never fully understood which situations the algorithm struggles with most. Now, new work explains it, establishing that gradient descent, at heart, tackles a fundamentally difficult computational problem. The new result places limits on the type of performance researchers can expect from the technique in particular applications.
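For readers who have not met it, here is a minimal sketch of plain gradient descent on a deliberately easy one-dimensional function, f(x) = (x - 3)^2. The hardness result above concerns far nastier landscapes than this convex toy case.

```python
# Gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
def grad(x):
    """Derivative of f(x) = (x - 3)^2."""
    return 2.0 * (x - 3.0)

x = 0.0              # starting guess
learning_rate = 0.1  # step size
for _ in range(100):
    x -= learning_rate * grad(x)  # step against the gradient

print(x)  # converges toward 3.0
```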
Here is an interesting signal of the current state of artificial intelligence applications: while trusting science is the best way to get reliable knowledge, trusting scientists is often not quite the same thing.
Analysis reveals that strange turns of phrase may indicate foul play in science.
In April 2021, a series of strange phrases in journal articles piqued the interest of a group of computer scientists. The researchers could not understand why researchers would use the terms ‘counterfeit consciousness’, ‘profound neural organization’ and ‘colossal information’ in place of the more widely recognized terms ‘artificial intelligence’, ‘deep neural network’ and ‘big data’.
Further investigation revealed that these strange terms — which they dub “tortured phrases” — are probably the result of automated translation or software that attempts to disguise plagiarism. And they seem to be rife in computer-science papers.
Research-integrity sleuths say that Cabanac and his colleagues have uncovered a new type of fabricated research paper, and that their work, posted in a preprint on arXiv on 12 July, might expose only the tip of the iceberg when it comes to the literature affected.
To get a sense of how many papers are affected, the researchers ran a search for several tortured phrases in journal articles indexed in the citation database Dimensions. They found more than 860 publications that included at least one of the phrases, 31 of which were published in a single journal: Microprocessors and Microsystems.
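The detection idea is simple enough to sketch. Below is a minimal, hypothetical version: a hand-made mini-dictionary of tortured phrases mapped to the standard terms they displace, scanned against a piece of text. The real screening tools use far larger curated lists.

```python
import re

# Hypothetical mini-dictionary: tortured phrase -> expected standard term.
TORTURED = {
    "counterfeit consciousness": "artificial intelligence",
    "profound neural organization": "deep neural network",
    "colossal information": "big data",
}

def flag_tortured_phrases(text: str):
    """Return (tortured phrase, standard term) pairs found in the text."""
    lowered = text.lower()
    return [
        (phrase, standard)
        for phrase, standard in TORTURED.items()
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered)
    ]

sample = "We train a profound neural organization on colossal information."
print(flag_tortured_phrases(sample))
# [('profound neural organization', 'deep neural network'),
#  ('colossal information', 'big data')]
```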
Progress in quantum computing continues - this signals a significant advance.
Quantum engineers from UNSW Sydney have removed a major obstacle that has stood in the way of quantum computers becoming a reality. They discovered a new technique they say will be capable of controlling millions of spin qubits—the basic units of information in a silicon quantum processor.
Until now, quantum computer engineers and scientists have worked with a proof-of-concept model of quantum processors by demonstrating the control of only a handful of qubits.
But with their latest research, published today in Science Advances, the team have found what they consider "the missing jigsaw piece" in the quantum computer architecture that should enable the control of the millions of qubits needed for extraordinarily complex calculations.
In this decade we are nudged to enact a Star Trek-like relationship with a ubiquitous presence of computational support: “Computer, make it so” is morphing into “OK Google,” “Cortana,” “Alexa,” and so on. This signals a more profound relationship.
Artificial intelligence research company OpenAI has announced the development of an AI system that translates natural language into programming code. Called Codex, the system is being released as a free API, at least for the time being.
Codex is more of a next-step product for OpenAI, rather than something completely new. It builds on Copilot, a tool for use with Microsoft's GitHub code repository. With the earlier product, users would get suggestions similar to those seen in autocomplete in Google, except it would help finish lines of code. Codex has taken that concept a huge step forward by accepting sentences written in English and translating them into runnable code. As an example, a user could ask the system to create a web page with a certain name at the top and with four evenly sized panels below numbered one through four. Codex would then attempt to create the page by generating the code necessary for the creation of such a site in whatever language (JavaScript, Python, etc.) was deemed appropriate. The user could then send additional English commands to build the website piece by piece.
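For the curious, here is a hedged sketch of what calling the Codex beta might have looked like from Python. The engine name "davinci-codex" and the Completion interface reflect the openai client of that 2021 beta period and are assumptions here; the API has changed since, and the key is a placeholder.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

# An English description of the page from the example above.
prompt = (
    '"""\n'
    "Create an HTML page with the title 'My Page' at the top and four\n"
    "evenly sized panels below, numbered one through four. Write the\n"
    "page to a file named index.html using Python.\n"
    '"""\n'
)

response = openai.Completion.create(
    engine="davinci-codex",  # assumed beta engine name
    prompt=prompt,
    max_tokens=300,
    temperature=0,           # keep the generated code deterministic-ish
)

print(response["choices"][0]["text"])  # the generated program
```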
jeezuz -
our interfaces -
with the digital habitus -
are like the -
wall-o-rules in -
Animal Farm -
or a slow acid trip -
where everyday habits -
of perceptions -
change in ways that feel -
like we’re getting Alzheimer’s -
don’t know the motivation -
what the search is for ? -
to find or -
to get-away ? -
to be healed -
or escaping a shadow ? -
life or -
unconsciousness -
what's the matter -
in hand -