Mar 29 2024

Is Music Getting Simpler?

I don’t think I know anyone personally who doesn’t have strong opinions about music – which genres they like, and how the quality of music may have changed over time. My own sense is that music as a cultural phenomenon is incredibly complex, no one (in my social group) really understands it, and our opinions are overwhelmed by subjectivity. But I am fascinated by it, and often intrigued by scientific studies that try to quantify our collective cultural experience. And I know there are true experts in this topic, musicologists and even ethnomusicologists, but I haven’t found good resources for science communication in this area (please leave any recommendations in the comments).

In any case, here are some random bits of music culture science that I find interesting. A recent study analyzing 12,000 English-language songs from the last 40 years found that songs have been getting simpler and more repetitive over time. They use fewer words with greater repetition. Further, the structure of the lyrics is getting simpler, and they are more readable and easier to understand. The use of emotional words has also increased, and lyrics have become overall more negative and more personal. I have to note this is a single study and there are some concerns about the software used in the analysis, but while this is being investigated the authors state that it is unlikely any glitch will alter their basic findings.
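
The study’s actual pipeline isn’t described here, but one of the properties at issue – lexical repetitiveness – is easy to sketch. Below is a minimal illustration (my own, not the study’s method) using a type-token ratio: the fraction of unique words falls as a lyric leans harder on repetition. The lyrics and function names are invented for the example.

```python
# A crude sketch of one way lyrical simplicity might be quantified.
# Illustration only, not the study's actual method.

from collections import Counter

def type_token_ratio(lyrics: str) -> float:
    """Unique words divided by total words; lower means more repetitive."""
    words = lyrics.lower().split()
    return len(set(words)) / len(words) if words else 0.0

def most_repeated(lyrics: str, n: int = 3):
    """The n most frequent words, a rough proxy for chorus repetition."""
    return Counter(lyrics.lower().split()).most_common(n)

verse = "shine on you crazy diamond remember when you were young"
chorus = "dance dance dance dance all night dance dance all night"

print(type_token_ratio(verse))   # 0.9 - mostly unique words
print(type_token_ratio(chorus))  # 0.3 - heavy repetition
print(most_repeated(chorus))     # [('dance', 6), ('all', 2), ('night', 2)]
```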

But taken at face value, it’s interesting that these findings generally fit my subjective experience. This doesn’t necessarily make me more confident in the findings, and I do worry that I am just viewing these results through my confirmation bias filter. Still, it fits not only what I have perceived in music but also in culture in general, especially with social media. We should be wary of simplistic explanations, but I wonder if this is mainly due to a general competition for attention. Over time there is a selective pressure for media that is more immediate, more emotional, and easier to consume. The authors also speculate that it may reflect our changing habits of media consumption. There is a greater tendency to listen to music, for example, in the background, while doing other things (perhaps several other things).


Mar 26 2024

The Experience Machine Thought Experiment

In 1974 Robert Nozick published the book Anarchy, State, and Utopia, in which he posed the following thought experiment: if you could be plugged into an “experience machine” (what today we would likely call virtual reality or “the Matrix”) that could perfectly replicate real-life experiences, but was 100% fake, would you do it? The question was whether you would do this irreversibly, for the rest of your life. What if, in this virtual reality, you could live an amazing life – perfect health and fitness, wealth and resources, and unlimited opportunity for adventure and fun?

Nozick hypothesized that people generally would not elect to do this (as summarized in a recent BBC article). He gave three reasons: we want to actually do certain things, not just have the experience of doing them; we want to be a certain kind of person, which can only happen in reality; and we want meaning and purpose in our lives, which is only possible in reality.

A lot has happened in the last 50 years and it is interesting to revisit Nozick’s thought experiment. I would say I basically disagree with Nozick, but there is a lot of nuance that needs to be explored. For me there are two critical variables, only one of which I believe was explicitly addressed by Nozick. In his thought experiment, once you go into the experience machine you have no memory of doing so, and therefore you would believe the virtual reality to be real. I would not want to do this. So in that sense I agree with him – but he did not give this as a major reason people would reject the choice. I would be much more likely to go into a virtual reality if I retained knowledge of the real world and of the fact that I was in a virtual world.


Mar 25 2024

Man Gets Pig Kidney Transplant

On March 16 surgeons transplanted a kidney taken from a pig into a human recipient, Rick Slayman. So far the transplant is a success, but of course the real test will be how well the kidney functions and for how long. This is the first time such a transplant has been done into a living recipient – previous experimental pig kidney transplants were done on brain-dead patients.

This approach to essentially “growing organs” for transplant into humans, in my opinion, has the most potential. There are currently over 100,000 people on the US transplant waiting list, and many of them will die while waiting. There are not enough organs to go around. If we could somehow manufacture organs, especially ones that have a low risk of immune rejection, that would be a huge medical breakthrough. Currently there are several options.

One option is essentially to construct a new organ. Attempts are already underway to 3D print organs from stem cells, which can be taken from the intended recipient. This requires a “scaffold” – connective tissue taken from an organ that has been stripped of its cells. So you still need, for example, a donor heart. You then strip that heart of its cells and 3D print new heart cells onto what’s left to create a new heart. This is tricky technology, and I am not confident it will even work.

Another option is to grow the organs ex vivo – in a tank of some kind, from stem cells taken from the intended recipient. The advantage here is that the organ can potentially be a perfect new organ, entirely human, with the genetics of the recipient, so there would be no issues with rejection. The main limitation is that it takes time. Considering, however, that people often spend years on the transplant wait list, this could still be an option for some. The bigger problem is that we don’t currently have the technology to do this.


Mar 21 2024

Using CRISPR To Treat HIV

CRISPR has been big scientific news since it was introduced in 2012. The science actually goes back to 1987, but the CRISPR/Cas9 system was patented in 2012, and its developers won the Nobel Prize in Chemistry in 2020. The system gives researchers the ability to quickly and cheaply make changes to DNA by seeking out and matching a desired sequence and then making a cut in the DNA at that location. This can be done to inactivate a specific gene or, using the cell’s own repair machinery, to insert a gene at that location. This is a massive boon to genetics research, but it is also a powerful tool of genetic engineering.
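
For a sense of how mechanical the “seek out and match” step is, here is a toy sketch – not a real bioinformatics tool; the sequences and function are invented. Cas9 is guided by a roughly 20-nucleotide RNA sequence and cuts about 3 base pairs upstream of an adjacent “NGG” PAM motif:

```python
# Toy illustration of the "seek out and match" step of CRISPR/Cas9.
# Sequences are made up; this scans one strand only.

import re

def find_cut_sites(genome: str, guide: str):
    """Return cut positions where the guide plus an NGG PAM occur."""
    sites = []
    for m in re.finditer(f"{guide}(?=[ACGT]GG)", genome):
        # The blunt cut falls ~3 bp upstream of the PAM,
        # i.e. 3 bp before the end of the guide match.
        sites.append(m.end() - 3)
    return sites

genome = "ATGCCTACGGATCCTTAGGCATTACTGGACCATGGCTAA"
guide  = "TACGGATCCTTAGGCATTAC"  # invented 20-nt guide sequence

print(find_cut_sites(genome, guide))  # [22]
```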

There is also the potential for CRISPR to be used as a direct therapy in medicine. In 2023 the first regulatory approvals for CRISPR as a treatment for disease were given, to treatments for sickle cell disease and thalassemia. These diseases were targeted for a technical reason – you can take bone marrow out of a patient, use CRISPR to alter the genes for hemoglobin, and then put it back in. What’s really tricky about using CRISPR as a medical treatment is not necessarily the genetic change itself but getting the CRISPR to the correct cells in the body. This requires a vector, and that is the most challenging part of using CRISPR as a medical intervention. But if you can bring the cells to the CRISPR, that eliminates the problem.


Mar 18 2024

Energy Demand Increasing

For the last two decades electricity demand in the US has been fairly flat. While it has increased overall, the rate of increase has been very low. This has largely been attributed to the fact that as the use of electrical devices has increased, the efficiency of those devices has also increased. The introduction of LED bulbs, increased building insulation, and more energy-efficient appliances have largely offset increased demand. However, the most recent reports show that US electricity demand is turning up, and there is real fear that this recent spike is not a short-term anomaly but the beginning of a long-term trend. For example, the projected increase in energy demand by 2028 nearly doubled from the 2022 estimate to the 2023 estimate – “from 2.6% to 4.7% growth over the next five years.”

First, I have to state my usual skeptical caveat – these are projections, and we have to be wary of projecting short-term trends indefinitely into the future. The numbers look like a blip on the graph, and it seems premature to take that blip and extrapolate it out. But these forecasts are not just based on looking at such graphs and extending the line of current trends. They are based on industry analysis, which includes projects that are already under way. So there is some meat behind these forecasts.

What are the factors driving this current and projected increase in electricity demand? They are all the obvious ones you might think of. First, something I and other technology-watchers predicted: the increasing use of electric vehicles. In the US there are more than 2.4 million registered electric vehicles. While this is only about 1% of the US fleet, EVs represent about 9% of new car sales, and growing. If we are successful in somewhat rapidly (it will still take 20-30 years) changing our fleet of cars from gasoline to electric or hybrid, that represents a lot of demand on the electricity grid. Some have argued that EV charging happens mostly at night (off-peak), so it will not necessarily require increased electricity production capacity, but that is only partly true. Many people will still need to charge up on the road, or will charge at work during the day, for example. It’s hard to avoid the fact that EVs represent a potentially massive increase in electricity demand. We need to factor this in when planning future electricity production.
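
To get a feel for the scale, here is a back-of-envelope calculation. Every number below is a rough assumption of mine for illustration, not a figure from the article:

```python
# Back-of-envelope estimate of what full fleet electrification could add
# to US electricity demand. All figures are rough assumptions.

vehicles          = 280e6    # approx. US light-duty vehicle fleet
miles_per_year    = 12_000   # typical annual miles per vehicle
kwh_per_mile      = 0.30     # typical EV efficiency
us_generation_twh = 4_200    # approx. annual US generation, in TWh

added_twh = vehicles * miles_per_year * kwh_per_mile / 1e9  # kWh -> TWh
print(f"Added demand: ~{added_twh:.0f} TWh/year "
      f"(~{100 * added_twh / us_generation_twh:.0f}% of current generation)")
# -> roughly 1000 TWh/year, on the order of a 20-25% increase
```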

Another factor is data centers. The world’s demand for compute is increasing, and there are already plans for many new data centers, which are a lot faster to build than the power plants needed to run them. Recent advances in AI only increase this demand. Again, we may mitigate this somewhat by prioritizing advances that make computers more energy efficient, but this will only be a partial offset. We also have to think about applications, and whether they are worth it. The one that gets the most attention is crypto – by one estimate, Bitcoin mining alone used 121 terawatt-hours of electricity in 2023, the same as the Netherlands (a country of 17 million people).


Mar 15 2024

What Is a Grand Conspiracy?

Ah, the categorization question again. This is an endless, but much needed, endeavor within human intellectual activity. We need to categorize things, if for no other reason than to communicate with each other about them. Skeptics like me often talk about conspiracy theories or grand conspiracies. We also often define exactly what we mean by such terms, although not always exhaustively or definitively – it is too cumbersome to do so every single time we refer to such conspiracy theories. To some extent there is a cumulative aspect to discussions about such topics, either here or, for example, on my podcast, and I expect regular readers or listeners to remember what has come before.

For blog posts I also tend to rely on links to previous articles for background, and I have little patience for those who cannot be bothered to click these links before asking questions or making accusations about not having properly defined a term, for example. I don’t expect people to have memorized my entire catalogue, but I do expect them to click the links that are obviously there to provide further background and explanation. Along those lines, I suspect I will be linking to this very article in all my future articles about conspiracy theories.

What is a grand conspiracy theory? First, a bit more background about categorization itself. There are two concepts I find most useful when thinking about categories – operational definitions and defining characteristics. An operational definition is essentially a list of inclusion and exclusion criteria – a formula that, if you follow it, will determine whether something fits within the category or not. It’s not a vague description or general concept – it is a specific list of criteria that can be followed “operationally”. This comes up a lot in medicine when defining a disease. For example, the operational definition of “essential hypertension” is persistent (three readings or more) systolic blood pressure over 130 or diastolic blood pressure over 80.
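
The hypertension example shows what “operational” means: the criteria are precise enough to implement as a literal checklist. Here is that definition as code – the function and data structure are my own illustration, with the thresholds quoted above:

```python
# An operational definition as literal code. Illustrative only;
# the thresholds are the ones quoted in the post.

def is_essential_hypertension(readings: list[tuple[int, int]]) -> bool:
    """readings: (systolic, diastolic) pairs from separate measurements.

    Operational criteria: persistent elevation, i.e. three or more
    readings with systolic > 130 or diastolic > 80.
    """
    elevated = [r for r in readings if r[0] > 130 or r[1] > 80]
    return len(elevated) >= 3

print(is_essential_hypertension([(142, 88), (136, 84), (138, 79)]))  # True
print(is_essential_hypertension([(128, 78), (135, 82)]))             # False
```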


Mar 12 2024

Pentagon Report – No UFOs

In response to a recent surge in interest in alien phenomena and claims that the US government is hiding what it knows about extraterrestrials, the Pentagon established an office to investigate the question – the All-Domain Anomaly Resolution Office (AARO). They have recently released volume I of their official report – their conclusion:

“To date, AARO has not discovered any empirical evidence that any sighting of a UAP represented off-world technology or the existence of a classified program that had not been properly reported to Congress.”

They reviewed evidence from 1945 to 2023, including interviews, reports, and classified and unclassified archives, spanning all “official USG investigatory efforts” regarding possible alien activity. They found nothing – nada, zip, goose egg, zero. They did not find a single credible report or any physical evidence. They followed up on all the fantastic claims by UFO believers (they now use the term UAP, for unidentified anomalous phenomena), including individual sightings, claims of secret US government programs, and claims of reverse-engineering alien technology or possessing alien biological material.

They found that all eyewitness accounts were either misidentified mundane phenomena (military aircraft, drones, etc.) or simply lacked enough evidence to resolve. Eyewitness accounts of secret government programs were all misunderstood conversations or hearsay, often referring to known and legitimate military or intelligence programs. Their findings are familiar to any experienced skeptic – people misinterpret what they see and hear, fitting their misidentified perceptions into an existing narrative. This is what people do. This is why we need objective evidence to know what is real and what isn’t.

I know – this is a government report saying the government is not hiding evidence of aliens. This is likely to convince no hard-core believer. Anyone using conspiracy arguments to prop up their claims of aliens will simply incorporate this into their conspiracy narrative. Grand conspiracy theories are immune to evidence and logic, because the conspiracy can be used to explain away anything – any lack of evidence, or any disconfirming evidence. It is a magic box in which any narrative can be true without the burden of evidence or even internal consistency.


Mar 11 2024

Mach Effect Thrusters Fail

When thinking about potential future technology, one way to divide possible future tech is into the probable and the speculative. Probable future technology involves extrapolating existing technology into the future, such as imagining what advanced computers might be like. This category also includes technology that we know is possible, we just haven’t mastered it yet, like fusion power. For these technologies the question is more when than if.

Speculative technology, however, may or may not even be possible within the laws of physics. Such technology is usually highly disruptive, seems magical in nature, but would be incredibly useful if it existed. Common technologies in this group include faster-than-light travel or communication, time travel, zero-point energy, cold fusion, anti-gravity, and propellantless thrust. I tend to think of these as science fiction technologies, not just speculative. The big question for these phenomena is how confident we are that they are impossible within the laws of physics. They would all be awesome if they existed (well, maybe not time travel – that one is tricky), but I am not holding my breath for any of them. If I had to bet, I would say none of these exist.

That last one, propellantless thrust, does not usually get as much attention as the other items on the list. It is rarely discussed explicitly in science fiction, but it is often portrayed and just taken for granted. Star Trek’s “impulse drive”, for example, seems to lack any propellant. Any ship that zips into orbit like the Millennium Falcon is likely also using some combination of anti-gravity and propellantless thrust. It certainly doesn’t have large fuel tanks or display any exhaust like a modern rocket.

In recent years NASA has tested two speculative technologies that claim to be able to produce thrust without propellant – the EM drive and the Mach Effect thruster (MET). For some reason the EM drive received more media attention (including from me), but the MET was actually the more interesting claim. All existing forms of internal thrust involve throwing something out the back end of the ship. Conservation of momentum means that there will be an equal and opposite reaction, and the ship will be thrust in the opposite direction. This is your basic rocket. We can get more efficient by accelerating the propellant to higher and higher velocity, so that you get maximal thrust from each atom of propellant your ship carries, but there is no escape from the basic physics. Ion drives are perhaps the most efficient thrusters we have, because they accelerate charged particles to very high exhaust velocities, but they produce very little thrust. So they are good for moving ships around in space but cannot get a ship off the surface of the Earth.
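
The momentum bookkeeping in that paragraph fits in two formulas: thrust is just momentum expelled per second, and the total change in velocity follows from the Tsiolkovsky rocket equation. A quick sketch with illustrative numbers (not the specs of any real engine):

```python
# Thrust from momentum conservation (F = mdot * v_e) and delta-v from
# the Tsiolkovsky rocket equation. Numbers are illustrative only.

import math

def thrust(mdot_kg_s: float, v_exhaust_m_s: float) -> float:
    """Newtons of thrust: propellant mass flow times exhaust velocity."""
    return mdot_kg_s * v_exhaust_m_s

def delta_v(v_exhaust_m_s: float, m_full: float, m_empty: float) -> float:
    """Rocket equation: dv = v_e * ln(m_full / m_empty)."""
    return v_exhaust_m_s * math.log(m_full / m_empty)

# Chemical rocket: huge mass flow, modest exhaust velocity -> big thrust
print(thrust(mdot_kg_s=2500, v_exhaust_m_s=4_400))   # ~11 million N

# Ion drive: tiny mass flow, very high exhaust velocity -> tiny thrust
print(thrust(mdot_kg_s=5e-6, v_exhaust_m_s=30_000))  # ~0.15 N

# But the same propellant fraction buys far more total delta-v
# at higher exhaust velocity:
print(delta_v(4_400, 100, 20))    # ~7,100 m/s
print(delta_v(30_000, 100, 20))   # ~48,300 m/s
```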


Mar 07 2024

Is the AI Singularity Coming?

Like it or not, we are living in the age of artificial intelligence (AI). Recent advances in large language models, like ChatGPT, have helped put advanced AI in the hands of the average person, who now has a much better sense of how powerful these AI applications can be (and perhaps also their limitations). Even though they are narrow AI, not sentient in a human way, they can be highly disruptive. We are about to go through the first US presidential election where AI may play a significant role. AI has revolutionized research in many areas, performing months or even years of research in mere days.

Such rapid advances legitimately make one wonder where we will be in 5, 10, or 20 years. Computer scientist Ben Goertzel, who popularized the term AGI (artificial general intelligence), recently stated during a presentation that he believes we will achieve not just AGI but an AGI singularity involving a superintelligent AGI within 3-8 years. He thinks it is likely to happen by 2030, but it could happen as early as 2027.

My reaction to such claims, as a non-expert who follows this field closely, is that this seems way too optimistic. But Goertzel is an expert, so perhaps he has some insight into research and development happening in the background that I am not aware of. So I was very interested to see his line of reasoning. Would he hint at research that is on the cusp of something new?

Goertzel laid out three lines of reasoning to support his claim. The first is simply extrapolating from the recent exponential growth of narrow AI. He admits that LLMs and other narrow AI systems are not themselves on a path to AGI, but they show the rapid advance of the technology. He aligns himself here with Ray Kurzweil, who apparently has a new book coming out, The Singularity is Nearer. Kurzweil has a reputation for making overly optimistic predictions about advances in computer technology, so that is not surprising.


Mar 04 2024

Climate Sensitivity and Confirmation Bias

I love to follow kerfuffles between different experts and deep thinkers. It’s great for revealing the subtleties of logic, science, and evidence. Recently there has been an interesting online exchange between a physicist and science communicator (Sabine Hossenfelder) and some climate scientists (Zeke Hausfather and Andrew Dessler). The dispute is over equilibrium climate sensitivity (ECS) and the recent “hot model problem”.

First, let me review the relevant background. ECS is a measure of how much warming will occur as the CO2 concentration in the atmosphere increases – specifically, the temperature rise in degrees Celsius with a doubling of CO2 from pre-industrial levels. This number is of keen significance to the climate change problem, as it essentially tells us how much and how fast the climate will warm as we continue to pump CO2 into the atmosphere. There are other variables as well, such as other greenhouse gases and multiple feedback mechanisms, making climate models very complex, but ECS is certainly a very important variable in these models.

There are multiple lines of evidence for deriving ECS, such as modeling the climate with all variables and seeing what the ECS would have to be in order for the model to match reality – the actual warming we have been experiencing. Our estimate of ECS therefore depends heavily on how good our climate models are. Climate scientists use a statistical method to determine the likely range of climate sensitivity: they take all the studies estimating ECS, creating a range of results, and then determine the 90% confidence range – it is 90% likely, given all the results, that ECS is between 2 and 5 C.
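
The reason ECS is defined “per doubling” is that radiative forcing grows roughly with the logarithm of CO2 concentration. Under that standard simplification (and it is a simplification, not a climate model), the implied equilibrium warming can be sketched directly:

```python
# Warming scales with the number of CO2 doublings: dT = ECS * log2(C/C0).
# A sketch under the standard logarithmic-forcing simplification.

import math

def warming(ecs_c: float, co2_ppm: float, co2_pre_ppm: float = 280) -> float:
    """Equilibrium warming in C: ECS per doubling times doublings."""
    return ecs_c * math.log2(co2_ppm / co2_pre_ppm)

# At 560 ppm (one doubling from pre-industrial), warming equals ECS
# by definition:
for ecs in (2.0, 3.0, 5.0):  # spanning the 90% confidence range cited above
    print(ecs, warming(ecs, 560.0))

# At today's ~420 ppm, that same range implies roughly 1.2-2.9 C
# at equilibrium:
print(warming(2.0, 420.0), warming(5.0, 420.0))
```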

