Archive for: August, 2010

Some true statements about Palin, Limbaugh

Aug 27 2010 Published by under Politics, Rant

This editorial by Timothy Egan has inspired me to make the following true statements. We'll see if Fox News picks them up and repeats them with the same sky-is-falling urgency with which they repeat other things.

  • I have not seen definitive evidence that Sarah Palin's husband wasn't a member of the KKK, dropping out only when she was invited to join John McCain's presidential ticket.
  • Rush Limbaugh has not given us proof that he wasn't secretly a member of the American Communist Party in his youth.

I mean, it's only American and responsible to ask the questions, right?

2 responses so far

Essential Science Fiction Movies

Aug 25 2010 Published by under Nerdism, Science Fiction

io9 is doing a series on Science Fiction for Beginners. It includes a post today by Charlie Jane Anders, 25 classic science fiction movies that everybody must watch. It's a good list.

I am embarrassed to admit that there are a couple of movies on the list I haven't seen. (No, I haven't seen Metropolis yet, and I realize that makes me culturally illiterate. Nor have I seen Planet of the Apes, Road Warrior, or Ghost in the Shell.) I'll have to make a point to see them.

I do agree with Anders about Brazil-- when forced to name a favorite movie of all time, that's usually the one I pick. I'm also happy that both a Star Trek and a Star Wars movie made the list, because those movies (the second in each series) were good movies; sometimes people are too self-consciously highbrow to include something from a mass-market franchise.

I do have to quibble with what Anders says about Back to the Future. A very fun movie, mind you, but I wouldn't say that its theory of time travel really makes all that much sense. I suppose it does apply the theory consistently, but it was definitely a "fantasy" theory of time travel. A science fiction movie that I think is great and that I'd include (along with Primer) as one of the two "essential" time travel movies is 12 Monkeys. (Which was directed by Terry Gilliam, who also did Brazil.)

Other movies that I would have considered for the top of the list include Gattaca (probably the most sensible treatment of the social spectre of genetically engineering our kids), The Truman Show, and maybe, just maybe, Buckaroo Banzai, as the definitive and most rewatchable treatment of camp ever.

9 responses so far

Radioactive decay rates... decreasing... because of... the Sun????

When I see something like this on Slashdot, I figure it's the usual crap science that somebody picked up. Only this time, the press release it links to is from Stanford, which we normally think of as a respectable institution.

The basic idea is that tiny decreases in the radioactive decay rates of some isotopes have been observed. Presumably these were statistically significant decreases, although I don't have details. One case seemed to correlate in time with a solar flare, and other cases seem to vary annually in ways that suggest that maybe, somehow, solar neutrinos are interacting with these isotopes and influencing the decay rates.
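
(If you want a feel for what "statistically significant" would have to mean here, the generic test for a claim like this is to fit the measured rates to a constant plus a one-year sinusoid and ask whether the fitted amplitude is many sigma away from zero. Below is a minimal sketch of that test in Python-- with entirely made-up stand-in numbers, since I haven't seen the actual data from either group.)

```python
# Sketch of a test for annual modulation in decay-rate data.
# All numbers here are hypothetical placeholders, NOT data from the papers.
import numpy as np
from scipy.optimize import curve_fit

def model(t_days, mean_rate, amplitude, phase):
    # Constant decay rate modulated by a one-year sinusoid.
    return mean_rate * (1 + amplitude * np.sin(2 * np.pi * t_days / 365.25 + phase))

# Fake stand-in data: four years of measurements with noise and a small
# injected ~0.1% annual wobble (the claimed effects are roughly this size).
rng = np.random.default_rng(42)
t_days = np.linspace(0, 4 * 365.25, 200)
rates = model(t_days, 100.0, 1e-3, 0.3) + rng.normal(0, 0.05, t_days.size)
errs = np.full(t_days.size, 0.05)

popt, pcov = curve_fit(model, t_days, rates, sigma=errs, p0=[100.0, 1e-3, 0.0])
amp, amp_err = popt[1], np.sqrt(pcov[1, 1])
print(f"fractional amplitude = {amp:.2e} +/- {amp_err:.1e}")
# A believable claim needs this amplitude many sigma from zero, AND it has
# to survive checks against seasonal systematics (temperature, electronics
# drift, humidity...), which is exactly why independent confirmation matters.
```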

I'm not going to believe this until I see strong evidence for it and until multiple groups have confirmed it. It would be cool if it were true, for it would tell us that neutrinos are interacting with other matter in ways that we didn't expect. But, for now, all I've been able to find are two papers (here and here). One is from a conference proceedings (and I've only seen the abstract); the other is a sort of response that has only been uploaded to the preprint server. In other words, as best I can tell, neither of these papers has yet been through any kind of peer review.

The latter paper— by Parkhomov, on the preprint server— has the full text available, although I have to admit I haven't read it. The abstract suggests, however, that he does not observe the effect mentioned in the conference proceedings.

So, we've got two papers: a conference proceedings, and a paper only uploaded to a preprint server, the latter contradicting the former. As such, I'm not going to get all excited about this until the paper trail gets a little bit more solid.

My prediction: this is going to go away and not turn out to be a real effect. But, I guess we should keep our eyes open in case it does turn out to be real. It would surprise the heck out of me if it were real, though.

6 responses so far

The Difference Between Religion and Woo

In one of my first couple of years as a physics professor at Vanderbilt, fellow astronomer David Weintraub introduced me to another faculty member we ran into at lunch. He was from one of the humanities departments-- I forget which. When David introduced me as somebody who worked on measuring the expansion rate of the Universe, this other fellow's immediate response was that the only reason we astronomers believed in the Big Bang theory was our Judeo-Christian cultural bias toward there being a moment of beginning.

I was quite taken aback. I tried to talk about the Cosmic Microwave Background, light element ratios, and so forth, but he waved them all off. I mentioned that his assertion wasn't even historically correct: earlier in the 20th century, the steady-state model (the Universe has always been as it is now) was if anything the dominant cosmological model. His response to hearing the postcard description of the Steady State Universe: "I like that one better." Scientific evidence be damned....

It was really quite an eye-opener. I had run into a living stereotype of the post-modernist deconstructionist, who believes that absolutely everything is a social construction. He had quickly judged the intellectual output of a field of study he was ignorant about, based on his own bias and methodology. While I suspect that scientists have overreacted to post-modern deconstructionism, this fellow showed me that at least some of what we overreact to is real. There are those who have convinced themselves that absolutely everything is a social construction. Thus, the only people studying what really matters are those who deconstruct said social constructions; everybody else is ultimately fooling themselves, playing around with their "science" and so forth while remaining trapped by their cultural blinders. Of course, this is a load of hogwash, and I am led to understand it's not even what most post-modern deconstructionist types actually believe.

Why do I mention this? Because I see a lot of those who call themselves skeptics making exactly the same mistake-- judging another field of intellectual inquiry by what they believe to be the one true way of reason. They dismiss things as trivial or childish based on criteria that aren't even relevant to the field of human intellectual activity they're trivializing. Specifically, there are a lot of people out there who will imply, or state outright, that the only form of knowledge that really can be called knowledge is scientific knowledge; that if it is not knowledge gained through the scientific method, it's ultimately all crap.

When I was in first or second grade, I wrote a story about a boy named Tom Tosels who found a living dinosaur. It was very exciting. It was also, well, a story written by a 7-year-old, and not one who was particularly literarily talented. Now, on a purely scientific basis, it's difficult to distinguish this story from the poetry of Robert Frost. It's words, written on a page, out of the imagination of a person (a person named Robert, even), telling a fictional story. What makes Robert Frost so much more important to human culture than the stories I wrote when I was 7? It's not a scientific question, but the answer is trivially obvious to those who study literature, culture, and history. And yet, using my 7-year-old story to dismiss all of literature as crap makes as much sense as using the notion of a teapot orbiting between Earth and Mars as a means of dismissing all of religion.

If you cannot see the difference between Russell's teapot and the great world religions, then you're no more qualified to talk about religion than the fellow who thinks that cultural bias is the only reason any of us believe in the Big Bang is qualified to talk about cosmology.

Phil Plait has written three blog posts on his famous "Don't Be a Dick" speech at TAM, a meeting of skeptics. (The posts are here, including a video of the talk; here, including links to bloggy reactions to the talk; and here, including personal reactions to the talk.) Some of the comments on the posts-- including, ironically, many from those who accuse Phil of being too vague and deny that the effect he discusses really exists-- are excellent illustrations of what he's talking about. Some of these comments (and even some comments that are supportive of his general message) illustrate the philosophical blinders you find on many in the skeptic movement. In the third post, there is a picture of Phil hugging Pamela Gay, a prominent pro-science speaker, a leading light of the skeptic movement... and a Christian. A number of responses express the sentiment of commenter Mattias:

When will we see Phil hugging a medium — calling for us to include them in our mutual skepticism about moon-hoaxers, homeopathy or, lets say, dogmatic religion?

There are quite a number of skeptics who openly say that they cannot see the difference between religion and belief in UFOs, homeopathy, or any of the rest of the laundry list of woo that exists in modern culture. Even those who agree that ridiculing people for their beliefs is not only counter-productive, but just bad behavior, often don't seem to think there's any difference between the brand of religion practiced by Pamela Gay (or by myself, for that matter) and Creationism, or even things like UFOs, mystical powers of crystals, psychic powers, and so forth. The assertion is that being religious is a sign of a deep intellectual flaw, that these people are not thinking rationally, not applying reason.

It's fine to believe this, just as it's fine to believe that the Big Bang theory is a self-delusional social construction of a Judeo-Christian culture. But it's also wrong. Take as a hint the fact that major universities have religious studies and sometimes even theology departments (or associated theology schools, as is the case with Vanderbilt). Now, obviously, just because somebody at a university studies something, it doesn't mean that that thing is intellectually rigorous. After all, cold fusion was briefly studied at universities, and ultimately it was shown that there was basically nothing to it. But it should at the very least give you pause. The fact that these studies have continued for centuries should suggest to you that there must indeed be something there worth studying.

Creationism is wrong. We know that. But the vast majority of intellectual theologians out there would tell you that creationism is based on a facile reading of Genesis, a reading that theology has left as far behind as physics has left behind the world-view of Aristotle.

Astrology is bunk, because it makes predictions about the world that have been shown to be false. Likewise, Creationism is bunk, because it makes statements about the history of the world and the Universe that have been shown to be false. But religion in general, or a specific one of the great world religions in particular, is not the same thing. It is true that lots of people use religion as a basis for antiscience. But there are also lots of people like Pamela and myself who are religious, and yet fully accept everything modern science has taught us. There are people-- theists-- whose study of those religions is based on reason and intellectual rigor, even though it does not begin with the scientific method. Yes, there is absolutely no scientific reason to believe in a God or in anything spiritual beyond the real world that we can see and measure with science. But that does not mean that those who do believe in some of those things can't be every bit as much a skeptic, wanting people to understand solid scientific reasoning, as a card-carrying atheist. Pamela Gay is a grand example of this.

Don't be like the post-modernist, so blinded by how compelling his own mode of thought is that he comes to believe that the only people who are intellectually rigorous and not fooling themselves are those who use exactly that, and only that, mode of thought.

43 responses so far

What to do about overproduction of PhDs?

Aug 19 2010 Published by under [Education&Careers], Academia, Rant

There is an interesting and anguishing post on Inside Higher Ed by psychology professor Monica J. Harris entitled Stop Admitting Ph.D. Students. (Hat tip: Chad.) She describes a problem familiar to anybody who's paid attention to the PhD market in probably just about any academic field over the last couple of decades. Departments continue to admit and produce PhD students, and college administrations (and rankings by professional societies) judge departments partly on their ability to produce large numbers of PhD students. Yet there are very few long-term jobs out there for people with PhDs. Knowing that society and her department aren't going to change to address the problem, she's tried to do what she thinks is the only ethical thing she can: she's no longer accepting new graduate students into her lab, so that at least she personally won't be contributing to the oversupply problem.

The comments are also very interesting. They range from agreement and sympathy to outright claims that she is lazy and "not doing her job." I think the best comment was made by "scandal and a byword":

Many of us PhD students DO know what we're getting into. The problem is that (at least in my experience) we're strongly discouraged from making contingency plans. I get a fairly explicit mixed message from my teachers:
1) There aren't many good (tenure-track research) jobs out there.
2) If I don't get a tenure-track research job, I'm a failure, and my name will ever be a scandal and a byword and a source of discomfort to my teachers. If I have any plan B, I'd better not mention it!

My own field is physics, and the problem of physicists being trained for, and expected to get, tenure-track faculty positions, without enough of those positions being out there, has been a sore topic for two decades (at least). In my last year or two of college (1989-1990), I remember reading a national report about how there was going to be a "shortage of scientists". This was based on the rather naive observation that the boom of scientists who went into the field after Sputnik were all about to retire. In reality, the tech push after Sputnik created a system whereby a tenure-track or tenured physics professor at a research institution produces something like 10-15 PhD students during his career. In other words, while he will retire only once, he replaces himself 10 to 15 times. At first this worked, because there was demand for that level of expansion. But not for long. Even considering that some will go to smaller, undergraduate-only colleges, this level of over-replacement is not sustainable.
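
To make that arithmetic concrete, here's a trivial back-of-envelope calculation-- with my own assumed numbers, not anything from a formal study:

```python
# Back-of-envelope: if faculty headcount is roughly flat, one retirement
# creates one opening, but each professor trains many PhDs in a career.
students_per_career = 12   # assumed midpoint of the 10-15 range above
phds_per_opening = students_per_career  # one retirement -> one opening

print(f"PhDs produced per research-faculty opening: {phds_per_opening}")
print(f"Fraction who can land such a job: {1 / phds_per_opening:.0%}")
# -> roughly 1 in 12, i.e. ~8%, before even counting foreign-trained
#    applicants or people re-entering the market after a postdoc or two.
```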

By 1991 or 1992, far from talk of a "shortage of scientists", there were regular columns and letters to the editor in Physics Today about how physics graduate students could usually get post-doctoral positions, but it was very tough for those post-docs to move on to a faculty position. At one point, one of Caltech's colloquium periods (perhaps it was Astronomy journal club-- I don't remember exactly) was given over to a discussion of this topic. One of the things parroted there, as in many of these articles, was that we need to be training our PhD students for jobs outside of academia as well. Professors said this... but I could almost hear each professor present thinking, "but my students will be the ones to get those coveted faculty positions." (Or perhaps it was "but Caltech students will...".)

At least in physics, and at an institution like Caltech, there is a very strong cultural sense that "success" means "ending up in a tenure-track faculty position at a research university". In grad school, when I would despair with my friends about our chances and mention that I was as interested in teaching as in research, or more so, they would say: oh, well, you can get a job at a small liberal arts college! Of course, those jobs are just as competitive as the research jobs. Yes, sometimes people "settle" for those jobs, but the truth is that there are a bunch of us who really value teaching as a primary professional, intellectual, and creative activity.

I also remember hearing students talk about PhDs who had gone on to teach high school, and how depressing it was that they'd had to settle for so little. At the time, I was seriously considering that as a long-term possibility myself, but I didn't say anything. And this comes back to the comment of "scandal and a byword" above: the culture of PhD-granting institutions in many fields remains extremely destructive to the notion that PhDs can be self-respecting individuals if they don't get one of the very few coveted faculty jobs.

Many of the comments on the thread note that cutting off the opportunity for people to get PhDs cuts off the opportunity for the people who value the PhD work itself. This is a valid point. What I tell people is that if they're going to go to graduate school in physics or astronomy, they should do so because they want to go to graduate school. There is absolutely no guarantee that the PhD will allow them to spend the rest of their lives in physics research. For people with their skills, PhD work is a more stressful and lower-paying occupation (*) than other things they could be doing. If the coveted faculty job were likely, it might be worth the "sacrifice" of going through a PhD program; but because that faculty job is not likely, the PhD has to be worth it all by itself.

(*) (Aside: in physics, it's a lot better than it is in the humanities. You generally teach for a couple of years, and most of the time your advisor has grant money to pay you a research assistantship to complete your PhD research. In the humanities, you may have a fellowship for a few years, but it's more common to have to teach for many years, or to have to do research assistantships that are not your own thesis research. Yes, you're being paid a pittance in physics, but at least you're being paid.)

You also need to be aware that you're going to receive direct and indirect pressure to consider "success" to mean going on in research. Even the pep talks about how great a given graduating class is will come across as pressure: "I'm sure you'll go on to do great things to advance the field!" It's supposed to be a compliment, but it bolsters the culture in which success means going on in research. You have to be aware of this, and you have to know that you're still a good person, still a good PhD, and still contributing to society even if you don't manage to go on in research-- or if, horrors, you choose not to.

The whole culture of the system is broken, and I don't see it changing any time soon. We've been collectively wringing our hands about it for at least a couple of decades, but the evaluation criterion for ranking departments remains "more PhDs" rather than "a responsible number of PhDs", and university administrations continue to pressure departments to produce lots of PhDs to make their numbers look good. How each of us responds to this ethically is a difficult question; I admire Monica Harris' response, and am dismayed by those who think she's just finding an excuse to be lazy. Myself, I think the most important thing is to make sure that undergrads going on to PhD programs are not fed a line about a "shortage of scientists", and are fully aware of what they're getting themselves into.

20 responses so far

Chad on the need for math in everyday life

I just wanted to point a link over at Chad Orzel's blog post Algebra and Circuit Breakers. He gives an example of how "mathematical reasoning"-- that is, the reasoning used to factor a polynomial-- is the same kind of reasoning you need to work through some situations that come up in everyday life.
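
I don't know exactly which everyday example Chad works through beyond the circuit breakers of the title, but here's my own toy version of the flavor of reasoning involved-- systematically eliminating possibilities rather than brute-forcing them, sketched in Python as the problem of finding which breaker controls an outlet:

```python
# My own illustration (not necessarily Chad's example): binary search
# over a breaker panel. Flip half the candidates off at a time; each
# test halves the possibilities, so 16 breakers need only 4 tests.

def find_breaker(breakers, outlet_is_live):
    candidates = list(breakers)
    while len(candidates) > 1:
        half = candidates[:len(candidates) // 2]
        if not outlet_is_live(off=half):   # outlet died: culprit is in `half`
            candidates = half
        else:                              # still live: culprit is in the rest
            candidates = candidates[len(candidates) // 2:]
    return candidates[0]

def make_panel(true_breaker):
    # Simulated house: the outlet is live unless its breaker is switched off.
    return lambda off: true_breaker not in off

print(find_breaker(range(1, 17), make_panel(true_breaker=11)))  # prints 11
```

The "structure the problem before you start flipping switches" habit is the same one algebra trains.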

Comments are off for this post

The Astronomy Decadal Survey

Every 10 years, a committee of astronomers gets together to solicit input from astronomers across the United States. This committee then sets forth a set of recommendations for the priorities in funding astronomy programs over the next ten years. By and large, the US astronomy community buys into this effort, and accepts the committee's recommendations as the consensus of the whole astronomy community.

Today (2010 Aug 13), the report for the 2010 decadal survey was released. Places to find information about it:

If you scroll down on the first page linked above, you can find the full text of the survey online for free. (I have just glanced at it, and it appears to be scanned images, perhaps at insufficient resolution.)

So far, I've only looked at the presentation. There's not a lot surprising in here. It lists the primary driving science goals, which are the current "holy grails" of astronomy. They include the detection of gravitational waves (which is strictly a physics issue, but one that can hopefully then be harnessed for astronomy), understanding the first generation of stars that caused the reionization of the Universe, understanding the Big Bang and also the present epoch of cosmic acceleration, and finding Earthlike planets outside our Solar System (but still within our Galaxy).

The funding recommendations are specific, at least for large projects. On the space side, the top priority is WFIRST, a wide-field infrared survey telescope, which would be used both for probing the signatures of cosmic acceleration and for finding exoplanets (as well as being a "general use" infrared telescope that would complement the JWST-- the JWST already being under construction).

The second priority is one I applaud: reinvigorating a previously existing NASA program of "explorer class" missions. These are small and mid-size space missions that don't have the cost of something like HST, JWST, Spitzer, Chandra, or WFIRST. Some of these missions have been extremely productive, and I'm glad to see the report listing them. I haven't read the full text for the justification, but I suspect that the flexibility to respond to new opportunities that come with new discoveries, together with the Explorer track record, is key.

After that are LISA, a space mission that will detect gravitational waves and really make gravitational wave astronomy possible, and then a powerful international X-ray telescope.

On the ground-based side, the budget scenario is more depressing. While the presentation linked above seems to believe there is a decent chance that NASA will be able to fund the top priorities, the ground-based large initiatives are sketchier. There is this ominous statement: "In event NSF budget is as predicted by agency, there can be no new starts without closure of major facilities following senior review."

What are the new starts? Two things are listed. The first is LSST, which has been the bandwagon of astronomy for several years already. It's going to be an impressive project: an 8m telescope in Chile that will survey the entire night sky in four different colors once every four days. This will produce an utterly mind-bogglingly huge amount of public data-- and indeed, some of the technical challenges of LSST involve effectively dealing with all of that data. It's going to be a data set that will be able to do a whole lot.
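
Just to give a feel for the scale, here's a rough back-of-envelope-- these are my own assumptions based on commonly quoted LSST design figures (a ~3.2 gigapixel camera taking short exposures all night), not numbers from the survey report:

```python
# Hedged estimate of LSST's raw data volume, with assumed inputs.
pixels_per_image = 3.2e9   # assumed ~3.2 gigapixel camera
bytes_per_pixel = 2        # 16-bit raw pixels
images_per_night = 2000    # assumed: ~15 s exposures, two per visit, long night

nightly = pixels_per_image * bytes_per_pixel * images_per_night
print(f"raw data per night:  ~{nightly / 1e12:.0f} TB")
print(f"raw data per decade: ~{nightly * 365 * 10 / 1e15:.0f} PB")
# -> on the order of 10+ TB every night, tens of petabytes over the survey,
#    before any processed catalogs -- hence "mind-bogglingly huge".
```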

It won't, however, be able to do everything. I have heard some astronomers say "the LSST will do everything". Sometimes they're theorists, but often they ought to know better. Yes, the LSST is going to be an amazing dataset that will "just do" some of what people do in special targeted projects right now. But there's a whole lot that it's not going to do by itself. I already know that there are astronomers out there (like, say, me) who are worried that bread-and-butter facilities used by lots of astronomers-- especially astronomers who aren't at a Caltech or a Harvard (i.e. an institution with its own private telescopes)-- will be sacrificed on the altar of the LSST. (And you can be sure that the key players in the LSST will come predominantly from institutions that also have their own private mid-size telescopes.) I really hope this doesn't have to happen. There is something called the MREFC-- Major Research Equipment and Facilities Construction-- an NSF budget thingy, the politics and economics of which I am clueless about. Ideally, LSST (and the other major projects) will be at least partially funded out of this, so that they don't eat up the entire NSF astronomy budget (leaving people who aren't key players on those huge projects completely in the cold).

The second major project mentioned is participating in one of the efforts to build a thirty-meter class "segmented mirror" telescope. This is a telescope like the Keck telescope, only with three times the diameter. Whereas the LSST will be surveying the entire night sky every four days, this giant telescope will be used for targeted observations of the most difficult targets requiring the best light-gathering power possible.

I've left out quite a number of projects in this brief drive-by. Take a look at the presentation... and if you have a whole lot of time to blow, you can always read the entire report.

As astronomers always say with great optimism when one of these things comes out, "it's going to be an exciting decade for astronomy." I have to admit, though, that with ongoing financial crises that don't seem to be resolving as fast as we'd hoped, coupled with the sure knowledge that in coming decades there's going to be ever more economic, political, and humanitarian turmoil as a result of anthropogenic climate change, I won't be surprised if, over the coming decade, the public starts to lose patience with pure science in the face of increasingly urgent crises (crises that are upon us because we spent so much time ignoring science).

Update: for a more thorough summary, see what Julianne has written at Cosmic Variance here, here, and here.

Comments are off for this post

Argument from Authority vs. Trusting Experts

Some folks who argue against anthropogenic climate change claim that folks like me-- who accept the evidence that it's happening and that it's something we should worry about-- are guilty of bad science. Specifically, that we're accepting arguments from authority rather than evaluating the evidence.

While argument from authority works in some lines of reasoning, it's anathema to science. Science usually proceeds by starting from a set of assumptions or postulates and seeing what results-- but those assumptions and postulates are always subject to test, and if experiment or observation shows that they're wrong, they have to be tossed out. We believe something is true in science because experiments or observations have shown it to be true, not because some designated authority has asserted that this is how things are.

However, if you perform a reductio ad absurdum on this argument, most of us have no right to accept the vast majority of the scientific knowledge that the human race has amassed. Have you, personally, verified Einstein's theory of Special Relativity? OK, I have seen the moons of Jupiter making their way around Jupiter, so I've confirmed Galileo's observation disproving geocentrism... but have you? And if you haven't... what right do you have to assert to geocentrists that they're full of it, and that the center of mass of the Solar System is really close to the Sun? Huh? Huh?

Over at the RealClimate blog, a guest commentary by Anderegg et al. makes this point in a way that struck me as rather nice:

We accept and rely upon the judgment and opinions of experts in many areas of our lives. We seek out lawyers with specific expertise relevant to the situation; we trust the pronouncement of well-trained airplane mechanics that the plane is fit to fly. Indeed, the more technical the subject area, the more we rely on experts. Very few of us have the technical ability or time to read all of the primary literature on each cancer treatment’s biology, outcome probabilities, side-effects, interactions with other treatments, and thus we follow the advice of oncologists. We trust the aggregate knowledge of experts – what do 97% of oncologists think about this cancer treatment – more than that of any single expert. And we recognize the importance of relevant expertise – the opinion of vocal cardiologists matters much less in picking a cancer treatment than does that of oncologists.

They don't even take the reductio to as absurdum a point as I did-- whereas I was talking about replicating the experiments yourself, they're just talking about reading the primary literature. Of course, in reading the primary literature, you're already taking some things on faith. (Little-f faith, not big-f Faith.) Specifically, you're trusting the ethics and competence of the investigators who performed and confirmed the experiments. You're trusting that the writers of the primary literature are not engaged in one big collusion and conspiracy to foist a falsehood on the rest of the world.

We do that constantly, every day, and it's only rational to do so. This includes climate change. The vast majority of people who know anything about climate change are convinced of the existence of anthropogenic climate change, and that it's a problem. The details and the severity of the problem remain under debate, of course, but the consensus that there's something to worry about is very strong. Accepting and acting on their expertise is not resorting to an argument from authority; it's just trusting the experts to know their field of expertise. Saying that we shouldn't advocate a national and global response to the problem of global warming without each of us individually verifying the evidence ourselves is tantamount to saying that it is unwise to get on an airplane without first learning enough to verify the mechanical fitness of the plane yourself.

5 responses so far

The view off my balcony

I just moved from Nashville, TN to Squamish, BC, where I'm starting to teach at Quest University.

Moving is always painful. There's the administration of it all, of course, and the sadness of leaving friends and community behind. And, there's all the boxes, the packing, the unpacking. This move is complicated by the fact that we're moving into a much smaller place (housing costs in Squamish are much higher than in Nashville!). We got rid of a lot of stuff in Nashville, but getting everything unpacked is still turning out to be a bit of a puzzle.

There are some advantages, though. Squamish is in a beautiful location in British Columbia, on the highway between Vancouver and Whistler. There's this massive cliff face (called "The Chief") overlooking the town-- and a harbour on the other side. We're in an apartment building, and below is a picture I took at 8:20 PM yesterday from our balcony. It had been a cloudy day, but it wasn't hazy (as it had been the previous day). Some of the low clouds were hovering below the height of the Chief, which made for quite an impressive sight.


[Photo from our balcony: low clouds hovering below the height of the Chief. Click to embiggen a bit.]

5 responses so far

Why Our Universe is Almost Certainly Somebody's Simulation

Despite the title, this blog post is really here as a warning to others about trying to reason from first principles about the nature of our Universe...

We have already done sophisticated simulations of the large-scale structure of our Universe. Yes, these cover only a small fraction of the total Universe, and, yes, they only simulate down to galaxy size. (That is, they don't resolve small enough scales to model the formation of stars or anything like that.) But the point remains that we're simulating the Universe.

We've also done "artificial life" simulations, in which we've shown that computer programs can evolve. One of the most interesting physics colloquia I heard as a grad student at Caltech in the first half of the 1990s was about the Tierra project, which created an "environment" in which small computer programs competed for resources. It had a mechanism for random mutation, and over time the programs evolved to become more efficient.

Take these simulations, allow for computer power to continue to improve as we've seen it improve in recent decades, and it really doesn't seem too much of a stretch to imagine that we could create a simulation of the local area of a galaxy with the computational power necessary to evolve very complex "organisms"-- simulated organisms, that is. Code in the basic laws of physics, give them enough space and computer power to evolve, and it could just happen.
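
If Tierra sounds abstract, here's a toy sketch of the core loop-- far, far simpler than Tierra itself, and purely my own illustration-- of random mutation plus selection under competition for limited "resource" slots:

```python
# Toy mutation-plus-selection loop (my illustration, not Tierra's design).
# Each "program" is just a bit string; fitness is how well it matches a
# target pattern, standing in for "doing the task efficiently".
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]
POP, GENERATIONS, MUTATION_RATE = 50, 200, 0.02

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for _ in range(GENERATIONS):
    # Selection: the fitter half claims the limited slots.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP // 2]
    # Reproduction with random bit-flip mutation.
    children = [[1 - g if random.random() < MUTATION_RATE else g
                 for g in random.choice(survivors)]
                for _ in range(POP // 2)]
    population = survivors + children

print("best fitness:", fitness(population[0]), "out of", len(TARGET))
```

Tierra did something much richer than this-- its "genomes" were actual executable code competing for CPU time and memory-- but the loop of mutate, compete, reproduce is the same.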

Now, consider our Universe as we've observed it to be. We only know of one life-bearing planet, but we do know that there are lots of other planets in our Galaxy. And, looking out there, within the observable Universe (not even considering things so far away that light hasn't had time to reach us since the Big Bang), there are something like 100 billion galaxies like ours. Given how tenacious life on this planet has been once it got started, even if it's rare for it to get started (say, only one or a few instances in our own Galaxy), there are almost certainly other planets out there with life on them.

Using our statistical sample size of one, we see that a technological civilization accounts for a few hundred years out of a few billion years of evolution. That's a small fraction... but given the number of stars in our Galaxy, and the number of galaxies in the observable Universe, it means that there are other technological civilizations out there, somewhere. Let's assume that an appreciable fraction (i.e. anything more than an infinitesimal fraction) of these civilizations is eventually able to produce computation capable of the kinds of simulations I'm talking about above. For roughly the last 7 billion years, it's been reasonable to suppose that life could arise on planets like our own. During those 7 billion years, there have almost certainly been lots of these simulations run. (What does "lots" mean? Well, the numbers going into this are very uncertain, of course, but it's probably somewhere between hundreds and billions.)
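
To see where "somewhere between hundreds and billions" comes from, here's the paragraph above as an explicit Fermi-style estimate. Every number in it is an assumption chosen for illustration, not a measurement:

```python
# Fermi estimate of simulated universes per real one (all inputs assumed).
galaxies = 1e11            # galaxies in the observable Universe
civs_per_galaxy = 1.0      # assume ~one technological civilization per galaxy
frac_that_simulate = 1e-3  # "appreciable fraction" that ever runs such sims
sims_per_simulator = 10.0  # simulations each of those civilizations runs

sims = galaxies * civs_per_galaxy * frac_that_simulate * sims_per_simulator
print(f"simulated universes per real one: ~{sims:.0e}")  # ~1e9 with these inputs
# Vary the assumed fractions up or down a few orders of magnitude and the
# answer swings between "hundreds" and "billions" -- but stays large.
```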

In other words: for every one observable Universe like our own in which intelligent creatures could arise, there are many simulations in which intelligent creatures could arise. Thus, any given civilization of intelligent creatures is far more likely to be within one of the simulations than in the real Universe.

So, we're probably all part of somebody's simulation.

Right. Do I really believe that? No. I mean, maybe, but if so, so what? If the simulation were done well enough, we'd have no way to tell the difference. So, at some level, trying to decide whether we're somebody's simulation or part of a real Universe is mental masturbation. We see a Universe out there with physical laws that are universally obeyed. The process of science has a great track record in explaining how this Universe works and predicting what we will observe. Hence, it makes the most sense to go with the simplest explanation: that there is a real Universe and we are working on understanding it. If certain things about our Universe seem improbable from first principles-- for instance, why are the densities of Dark Matter and Dark Energy so close?-- it's worth thinking about whether that's a pointer to something deeper. But, at some level, the Universe is what it is, and it's worth trying to understand it without getting hung up on fundamental probabilistic arguments that lead us to thinking that none of it means anything anyway.

13 responses so far
