Category Archives: Swiftocracy!

Swiftocracy!: Attack of the Toads

For the last two weeks each of my posts has been based on requests. For more information about how that happened, look here.

“Write about Horny Toads and their ability to defend themselves from predators.”


Horny Toads are not toads.

But they do have an abundance of horns.

Horny Toad is a colloquial term for several species of Horned Lizard, all of which are native to North America. They are not amphibians. Many of them live in the desert, though some can be found in the forests of Idaho and southern Oregon, and in Colorado. They’re small, and fat, and covered in spikes that make them more charismatic than your average fat lizard. They got the name Horny Toad because they look a lot like toads. They have a short, wide snout and a short, fat body. In addition they will inflate their bodies when threatened, which certainly seems to be toadlike behavior. I mean look at this thing:



I wouldn’t blame you for calling it a toad.

Horny Toads, despite their fearsome appearance, are actually quite small and probably quite tasty. They also don’t move too fast. The Horny Toad hunts by sitting very still and eating any ants that walk by its mouth. In order to survive it utilizes four distinct lines of defense.

1. Camouflage. Their dull earth tone scales and bumpy exterior means that this lizard blends right into the landscape while it waits for ants. If you can’t see him then you can’t eat him. Of course camouflage isn’t perfect. When a hungry coyote or bobcat spies him sitting on a rock he’ll have to try…

2. Inflation. The Horny Toad will inflate its body somewhat when threatened. This makes him look bigger, spikier, and can be a little surprising. The hope is that whoever is bothering him will get scared off. If this fails to impress he can always rely on…

3. His spikes. He’s a prickly little critter who may hurt going down. Some predators will be put off by this. If one tries to snatch him up he’ll usually lean down on one side to keep their jaws from getting a grip on his scaly hide. If this doesn’t work he has only one more trick up his sleeve, and it’s a real doozy.

4. Blood shooting eyeballs.

That’s a pretty strange defensive ability, I must admit.

Though it seems like something out of a prospector’s tall tale, the fact is that Horny Toads really can shoot blood out of their eyeballs. Well, out of ducts close to the eyes anyway. They can shoot blood up to five feet, which is pretty frightening to a hungry coyote. It’s just plain surprising. Predators are no stranger to blood, but they don’t expect it to start spraying until after they start biting. In nature you don’t get second chances, so most animals are wary of anything too surprising. As an added bonus there’s something in the blood that stinks to high heaven if you’re a canine or feline. It makes a predator wonder: why bother trying to eat this freaky stink blood shooting spiky thing when there are plenty of perfectly normal groundhogs around to munch on?

What is most surprising to me is how they shoot the blood out. I was ready to believe that they just have some natural little blood cannon ducts in their eyeballs that fill up with blood and squeeze it out like a Super Soaker. The truth is that they somehow reduce the flow of blood leaving their head. This builds up pressure until the ducts near their eyes literally burst, the blood vessels rupturing outward in a spray of blood. That sounds painful! Imagine if you could do that.

Now stop before you give yourselves nightmares.


Swiftocracy!: Foundation for the Young Earth

For the last two weeks each of my posts has been based on requests. For more information about how that happened, look here.

“Explain your top 5 most convincing evidences that the earth is relatively young.”

Let’s not dive right into this. Let’s take a little detour first by unpacking a little syllogism.

1. If the world was created by a non-natural process then science would never discover that fact.

2. The Christian believes that the universe was created by a non-natural process.

3. Therefore, if the Christian is correct, science will never accurately understand how the universe came to be.
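For the logically inclined, this is just modus ponens dressed up. Here’s a minimal sketch in Lean; the proposition names `C` and `D` are my own illustrative labels, not anything canonical:

```lean
-- C : the universe was created by a non-natural process
-- D : science accurately discovers how the universe came to be
variable (C D : Prop)

-- Premise 1 is the implication C → ¬D; premise 2 asserts C.
-- The conclusion ¬D follows by applying premise 1 to premise 2.
example (premise1 : C → ¬D) (premise2 : C) : ¬D :=
  premise1 premise2
```

The argument is valid on its face; the interesting disputes are over the premises, which is where the rest of this post goes.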

If God created the universe then whatever theories the scientists come up with about the origin of the universe will be fundamentally flawed. This is because science works under the idea of “methodological naturalism.” Naturalism, as I’ve mentioned before, is the idea that there is nothing in the universe besides mass, energy, and combinations of the two. No ghosts, gods, souls, or spirits. Methodological naturalism means that as far as performing science is concerned you must behave as if naturalism is correct. Another way of putting it is to say that science is the study of the natural world and must seek natural answers for natural events. If an event is non-natural then science can’t give us a definitive answer about its nature. If a miracle occurred you could not discover it through the scientific method: the best you could do is determine that there is no current scientific explanation for the phenomenon in question. There is nothing wrong with this. Science is a tool that exists to study the world as it naturally occurs. That is its purpose.

Some people take it to mean that because science as a tool must be methodologically natural, and because science has been very successful as a tool, then naturalism must be true. This is an error. The scientific method by definition can’t tell us anything about the existence of entities or phenomena that are outside of nature. If you want to find out whether such things exist then you must find your evidence for or against outside of science. It’s foolishness to say that the sun does not exist because you cannot sense it with a seismograph or measure it with a voltmeter; after all, they’re the wrong tools for the job. Belief in the sun is not discredited no matter how accurately a voltmeter measures electrical current. In the same way belief in the supernatural is not discredited no matter how well science measures and predicts natural phenomena.

Now this belief in the supernatural should be understood to be something distinct from what we might call “superstition.” Most superstitions make claims on how the natural world works, and thus fall under the domain of science. The belief that warts can be cured by washing with water from a silver vessel under the light of the full moon, for example, is making a natural claim: if you do X thing then result Y will occur. There is no problem with science proving such a belief to be false because it is a belief about natural events. Science has been effective at disproving superstitions, and because many people conflate the two concepts some believe that science has disproved the supernatural as well. This is a mistake. Superstitions make natural claims about natural phenomena (if you break a mirror you’ll have seven years of bad luck, animals can talk at midnight on Christmas Eve, hearing a screech owl three times in a night means someone will soon die, etc). The supernatural is a belief in non-natural phenomena (free will, a soul, God, objective morality, etc.). To continue the earlier metaphor, a voltmeter is very effective at discovering inaccurate beliefs about voltage, but it can’t tell us anything about justice.

All of this is a long way of making two little points. The first is that science, by definition, cannot prove or disprove the existence of God. The second is that if God was responsible for creating the universe, regardless of whether he did it in 6 days or not, then science alone will never give us an accurate picture of the universe’s origin. It simply can’t. Unless the universe has a purely natural explanation science is helpless to understand how it came to be. If, then, you believe that God does exist and that he did create the universe, then you must agree that the “scientific” explanation is not entirely correct. The Young Earth Creationist (YEC) and the Theistic Evolutionist should agree on this point. The main difference between the two positions is how much trust someone is willing to place in the scientist’s flawed model. Is the secular model only wrong in a few details, or completely inaccurate?

Many Christians put a lot of trust in the ability of scientists to understand what happened in the unobserved past. I would argue that much of this trust is misplaced. We can have absolute confidence in scientific theories that can be observed and experimented on today. This absolute confidence should not be extended to theories that are by their very nature unobservable and impossible to experiment upon. I laid most of this out in a previous post, so I won’t go on about it too much here. The essential point is that trying to understand events that occurred in the unobserved past is very much like trying to solve a murder. In a criminal investigation forensic (that is, scientific) evidence is important but ultimately is secondary to eyewitness testimony and deduction. CSIs can perform experiments on the physical aspects of a crime scene but they cannot experiment with the event itself. That’s why we have detectives. Science can tell us a lot about the physical artifacts we find buried in the ground, but we rely on historians to piece together what actually occurred in ancient times. In the same way scientists can examine the world around us today but they cannot examine or experiment on the past itself. The best they (and the best anyone) can do is to devise models of the past that best explain the most facts about the observable present. In other words, the theory that the world is billions of years old is, by its very nature, not as definite or reliable as the theory that water boils at 100 degrees Celsius at sea level. One has been tested and can still be tested today. The other is a model that has changed many times in the past, and will continue to change as more facts are discovered that must be accounted for.

However, as previously established, YEC is not really a scientific model, as it does not meet the scientific requirement of being methodologically naturalistic. That does not mean it is an invalid model, just that it is arguably a non-scientific model. Just because it is “unscientific” does not mean it isn’t true. All of history consists of models that are non-scientific in nature because they deal in events that cannot be observed. A historical approach that is methodologically naturalistic begs the question of whether supernatural events have ever occurred in history. The purpose of historical investigation is to determine whether an event occurred, and how; it is not to understand how the natural world works when left to its own devices. They are different fields that use different tools and have different purposes. Science’s purpose is to create a model of the natural world. History’s is to make a model of the events that have occurred on this world. YEC is in this sense a competing model to the more traditional understanding of the age and origins of the Earth.

We cannot prove either model definitively, but we can compare the two and see which model best explains the world as we know it now. Therefore the best “evidences” in favor of YEC consist of things we can observe today that are better explained by the YEC model than the traditional one.

This post is long enough as it is, and I haven’t even gotten to my evidence yet. Still, I feel that this foundation was necessary before we could proceed. I will give my “evidences” starting next Wednesday, and continuing each Monday, Wednesday, and Friday until I am satisfied. On Monday there will be a post about Horny Toads because of my prior Swiftocracy obligations. It should be fun.

Swiftocracy!: Movies out of Books


For the next two weeks each of my posts will be based on requests. For more information about how that happened, look here.

“Review books that have become movies, books that should be and what that would look like, and find a way to go on a rant! Also, anything else you would like to add on this subject!”

A science fiction and fantasy author by the name of Roger Zelazny supposedly had this to say to an aspiring writer who asked him for advice: “Tell a good story and all is forgiven.” That about sums up my current view of film adaptations of books.

I love books, and I love movies. Movies made from my favorite books should be right up my alley. I’m a very visual reader. I can see everything in my mind’s eye when reading. Because of this I used to believe, when I was young, that making a movie adaptation of any particular book would be a fairly simple affair. Naturally there are some books, where almost nothing really happens besides inner conflict, that would make terrible movies. But the books I liked to read were usually less cerebral. I liked science fiction, and fantasy, and whatever I could get my hands on from the Scholastic book fair when it visited my school. I dived into Holes, Harry Potter, I Left My Sneakers in Dimension X, Artemis Fowl, Frindle, and Maniac Magee. I’d read anything I could get my hands on by Bruce Coville, Jerry Spinelli, Neal Shusterman, Louis Sachar, or Andrew Clements. Each of these unfolded in my mind like a film reel, only better because it conveyed smells, touch, and thought. Making them into movies would be easy. You just take what’s there in the books (though really I mean what I can see with my mind) and you film it. Simplicity itself.

When they made a movie version of Holes I was eager to see it. The Holes adaptation was pretty good. Rotten Tomatoes gave it a 77%, which is an admirable score. My brothers liked it. My friends liked it. If I watched it today, I’d like it.

But when it came out I hated it.

Sitting in the theatre my mind was full of objections. Stanley Yelnats is supposed to be fat! Where did this grandpa character come from? That’s not right! That’s not quite how it happened! Zero is black? (As it turns out, that was just a mistake on my part. The book never specifically says that Zero is black, but it doesn’t say white either and all of the little implications seem to indicate that he is definitely not white. Still, in my mind’s eye, Zero was a skinny white kid.) I was outraged. How could they mess up the book so badly? I compared the movie on the screen to the one I had seen in my mind and it just didn’t match up. I was shocked to learn that Louis Sachar himself had worked with the filmmakers and gave the movie his seal of approval. How could he do that? They changed so many things!

Looking back on that I have to laugh at myself. The movie is actually quite accurate to the book by adaptation standards. They only changed a few elements and kept almost everything else the same. My problem was that I couldn’t understand why anything had to change at all.

After years of watching movies, reading books, and trying in my own clumsy way to create some of my own, I’ve learned better. The simple fact is that books and movies are different forms of media, and different media have different requirements. Movies are a visual and auditory medium, while books are neither. They’re experienced in different ways. I can pick up a book, read it for a few minutes, put it down again, come back to it later, flip a few pages back, reread something, and put it back down again. Movies aren’t meant to be viewed like that. They’re meant to be watched from start to finish in one sitting. They’re different crafts that require completely different sets of skills, each with its own needs, strengths, and weaknesses.

On top of that there are practical concerns. It’s unreasonable to expect to find someone who has good acting ability, looks exactly like the main character, and has the name recognition to put people in the seats. A book may take hours or days to read but a movie needs to come in around two hours or nobody will want to watch it. In a book, writing a scene that takes place on an alien planet with giant robot dinosaurs and crystalline aliens who occasionally explode takes exactly as much investment (that is, in time) as a scene where a lone woman sits in an empty room and cries. In a movie that first scene costs millions in special effects and takes months of work while the second can be done at a hundredth of the cost over the course of an afternoon. A movie needs a different kind of climax than a book. For example, in the last Twilight book the tension comes to a head when a bunch of powerful evil vampires face off against the good vampires and their allies. In the book everything builds up to this, and there is a lot of fear about who might die, whether there will be a fight at all, what will happen to their family, etc. The climax ends with the evil vampires deciding to leave after what amounts to a long and tense conversation. This works in a book; the conversation is tense, and everything rides on it. But in a movie it would be a flop. You can’t have people standing there and talking as the big third act climax. So when they made a movie out of it they actually showed a huge fight scene between the vampires with all kinds of craziness. I can’t blame them for this (and the way they pulled it off without totally going off the rails of the story was pretty clever). The book’s climax as it stood was unfilmable if you wanted the movie to be successful.

With all that in mind I began to wonder what the key to a successful adaptation was. And that’s what brings us back to the quote. “Tell a good story and all is forgiven.” A movie can change almost as many details as it wants…provided that the changes actually make the movie better. Or at the very least that the movie is a good one. Lord of the Rings is an almost perfect example of an excellent adaptation. The book was called “unfilmable” for good reason. It’s dense, it’s long, it requires a ton of backstory and exposition, there are too many characters, too many subplots, and too much going on for it to translate to film. But Peter Jackson did it. He did his best to keep the core of the story while streamlining it for filming. He added things, he changed things, he threw out a lot of stuff altogether. But in the end they are fantastic films, well loved by Tolkien fans. The majority of his changes made the films better. I like Tom Bombadil, but Jackson was right to cut him, the barrow-wights, and Old Man Willow right out. They would have made an already long movie longer, ruined the pacing, and were generally unimportant to the greater story. Now some changes didn’t work out so well (Frodo telling Sam to go home over lost bread? Are you serious?), but on the whole the trilogy works because they are good films executed well.

If you want to adapt a book you need to have two things as your focus. The first is that you must respect the original work. You must believe that the book contains a story in it that is worth telling. If you do then you must be committed to telling it well. Part of that is knowing that changes will have to be made.

Unless you’re one of the people behind the film adaptation of Eragon, in which case my advice to you is to never make an adaptation again. Also, thanks for ruining everything. I hope you’re happy.

Swiftocracy!: The History of Pencils (Involving High Tech Weaponry, Endangered Trees, and Chinese Emperors)


For the next two weeks each of my posts will be based on requests. For more information about how that happened, look here.

“The history of some completely mundane thing we use everyday.”

Let’s talk about pencils.

You know how they call the filling inside of pencils “lead,” even though it’s made of graphite? When I was a boy I was told that they used to use actual lead as the filling, but nowadays we’re smarter and use graphite because it isn’t deadly poisonous. That explanation turned out to be simultaneously completely false yet essentially true, all while muddling up the surprisingly fascinating history of the common pencil. To sort this out, let’s start from the beginning.

The word “pencil” is based on a Latin word that basically means paintbrush (look it up if you’re that into dead languages; I thought I’d boil it down to the essentials on this one). However the precursor to the pencil is not a paintbrush at all but a tool known as the stylus, which was popular in Roman times. The stylus was nothing more than a vaguely pencil-shaped piece of metal. This piece of metal was used to put marks into tablets of wax or to scratch very light and hard to see words onto papyrus. Honestly the words written in wax were probably hard to see too, but if you wanted something permanent or lasting you wouldn’t be using a stylus. A stylus was for temporary jobs where quality wasn’t important, like jotting down a quick inventory of your goods, or doing some math. At some point the Romans, who loved using lead for just about anything, made lead styluses, which had the added bonus of rubbing off on the material a little bit, leaving a faint black mark. This is as close as we’ll get to an actual lead pencil.

People made do with styluses until somewhere between 1500 and 1560 when a fantastic discovery was made. And by “discovery” I mean “somebody with big ideas found something that the locals have known about for years and never thought was really that interesting.” In the small village of Seathwaite in England shepherds had taken to marking their sheep with some odd grey rocks they kept finding in the hills nearby. The rocks were made of graphite, and Seathwaite was (and remains to this very day) the only place on planet earth where deposits of pure graphite could be found. Seathwaite was sitting on an inexplicably pure and humongous deposit of a substance that nobody even knew existed until the 1500s. When it was properly discovered chemists at the time believed that it must be some strange variety of lead. Soon it was commonly known as “black lead,” which is why we call the graphite in our pencils “lead” to this day.

After its “discovery” people started properly mining it and sawing off big hunks of it to use as styluses. Graphite was vastly superior to lead as a simple marking tool, and far handier than ink for the writer or artist on the go. However graphite is really brittle and breaks easily (as anyone with a mechanical pencil can tell you) so it required some kind of covering to keep it together. The earliest pencils were square rods of graphite that were sawn off a big block and wrapped in string or sheepskin. Eventually somebody figured out that wood was a lot more convenient, and the modern pencil was born.

Artists, writers, and businesspeople everywhere rejoiced at the discovery of graphite and the invention of the pencil. However not long after the graphite mines were dug the English government took them over and strictly limited their output. You see, graphite has properties besides being an excellent marking material. Metalworkers found that cannonball molds lined with graphite produced incredibly smooth cannonballs. Incredibly smooth cannonballs fire much farther and more accurately than those produced without the graphite lining. Britain was establishing itself as a major naval power at this time and they’d just been handed exclusive access to the material necessary for creating the most high-tech cannonballs in the world. They soon put the entire mine under guard. They were so security conscious, and so determined to prevent their enemies from getting pure graphite, that they would mine out as much as they would need for the next few years and then flood the entire mine. When they ran out they would pump out the water, mine some more, and then flood it again.

Enough graphite was released (or smuggled out) to support a small pencil industry. Pencils were so popular that a method was devised to create solid cores out of a mixture of graphite dust and various chemicals. Impure graphite deposits were found in Germany, and the Germans began selling to the rest of Europe (though their pencils were of far lower quality than the solid British versions). During the Napoleonic wars France found its pencil supplies cut off from both Britain and Germany, and devised a way to make pencil cores out of graphite powder and clay. Almost all pencil lead today is made using a similar method, as Seathwaite remains the only location where natural pure graphite has been found, and the mines there were played out years ago.

Somewhere in all this pencils went from being square to round. British pencils were recognizable by still having a square core (since they were all sawed off of blocks of graphite) while other pencils had the round core that we’re used to today.

In the 1800s Americans started making their own pencils so they wouldn’t have to import them. Making pencils was a slow process, and an American by the name of Ebenezer Wood sought to automate it. He came up with a lot of good ideas, but the one that has lasted the longest is the hexagon-shaped pencil we’re most used to today. Hexagons could be cut out of wood with far less waste than circles, and the practice stuck.

Another pencil innovation from America was the discovery that Eastern Red Cedar was fantastic for making pencil casings. The wood doesn’t splinter easily, is durable, and smells nice. Soon Eastern Red Cedar was being exported to pencil manufacturers around the world. By the turn of the century Eastern Red Cedar was in such high demand and short supply that people began tearing apart old cedar barns to turn the wood into pencils. It got so bad that during WWII the British government outlawed pencil sharpeners because they wasted too much valuable wood and graphite. All pencil sharpening had to be done the economical way, with a knife. Eventually it was discovered that the Incense Cedar, a tree native to the mountains of California, worked just as well. Today most pencils are made with Incense Cedar wood, unless they’re the really cheap kind.

In 1858 somebody got the bright idea to attach an eraser to the back of a pencil. (INCIDENTALLY, erasers have an interesting history too. People used to use sandstone and pumice to erase pencil marks until it was discovered that breadcrumbs erased marks well. Then in 1770 Edward Nairne claimed that he accidentally picked up a piece of rubber when he was reaching for a piece of bread and found it to be a superior eraser (this may or may not be true, as he made his living as one of the first eraser salesmen). Before this point nobody really had too much use for rubber; in fact, rubber got its name from its ability to “rub out” pencil marks. Neat.)

It was around that same time that a huge deposit of high quality graphite was discovered in northern China and Siberia. By this time the Seathwaite deposit was almost used up, so “Chinese lead” was soon known to make the best pencils around. In the 1890s an Austrian company started painting its high quality pencils yellow to signify their luxury status and to make people think of China. Up until this point most pencils were unpainted, to show off the wood. Yellow was associated with royalty in China, in part because of the fabled “Yellow Emperor” Huang Ti, who in ancient times supposedly invented the bow and arrow, wooden carts, and writing. So yellow reminded the Chinese of the Yellow Emperor and writing, which reminded the Western world of China, which made them think of all that high quality Chinese graphite. Competing pencil manufacturers released their own yellow painted luxury pencils. Soon everyone was painting their pencils yellow regardless of whether they had any Chinese graphite within them. Today a yellow pencil is as common as dirt, and usually signifies cheapness and mass production. Presumably if they had called Huang Ti the “Red Emperor” thousands of years ago we’d be up to our armpits in red pencils instead. History is funny like that.

So there you have it. From the Roman stylus to the modern Ticonderoga the pencil has a long and inexplicably interesting history. Where would the world be without it today?

Probably using pens, now that I think about it.

Swiftocracy!: I Cop Out Like a Weasel



For the next two weeks each of my posts will be based on requests. For more information about how that happened, look here.

“Logically and rationally defend something that you are very passionately opposed to.”


One of the keys to being a good debater (and arguably a well rounded person in general) is to understand that there are two sides to every issue. There are very rational and intelligent people out there who disagree with you about numerous topics of importance. It’s just the way things work. Some people, particularly those who have been sheltered from hearing opposing positions explained, have a problem with this. I’ve heard several accounts of people who were so shocked to discover that the people who disagreed with them actually had good reasons to do so that they began to doubt the veracity of all of their beliefs. This is a mistake, of course. There is a mindset that says that the right side of any debate will be immediately and always defensible against all arguments as long as the facts are known. This is incorrect. You can’t even definitively prove that anyone besides yourself exists; you can’t demonstrate beyond doubt that you have a brain. That’s just the way things are. That doesn’t mean that there isn’t a right side to a debate, but rather that we all must be comfortable with some level of doubt and accept that our opponents are not crazy or ignorant if they disagree with us.

Now the request presented to me was to logically and rationally defend something that I’m passionately opposed to. This is something I’m capable of doing. I spend a lot of time reading blogs run by people who disagree with me. I know the arguments. Some I even respect. And if the request had been for me to defend something I’m somewhat opposed to, or just generally opposed to, I would be fine.

But it said passionately opposed to. Very passionately.

And after thinking it over I just can’t do it.

The things I am passionately opposed to are things that I don’t just disagree with but that I think are actively harmful to humanity. Things that, if I could, I would wipe from the face of the earth. Things that make my blood boil and my hands tremble whenever I try to write about them. I know the arguments in favor of them. Many of them are good arguments (though I think they’re lacking on the whole). But I won’t defend them here. My conscience won’t let me do it. I know the internet. I’ll write up a post with a disclaimer at the top and bottom explaining that I’m actually opposed to these things and that this is just an exercise, and those disclaimers will be ignored. People will just be Googling around on the subject, and they’ll find this post and find in it support for something I find immoral and abhorrent.

I just can’t do it. I really can’t. And the things I’m not that passionate about aren’t really that interesting. I’m not particularly passionate about politics. Or food. Or Mac vs. PC or whatever. I mean I’m a PC guy and I make fun of Mac users but really it’s a completely legitimate OS with its own unique strengths. If it were just me and my friends shooting the breeze then I’d have no problem with this exercise. But this is the internet. This is the public sphere. I have very little influence on anyone, really. But what influence I have I feel the need to use wisely.

So yeah. Total cop out on my part here. Sorry about that.

Don’t mess with the passion!

Swiftocracy!: How Well Has the Church Provided?


For the next two weeks each of my posts will be based on requests. For more information about how that happened, look here.

“Where is the church with respect to the balance between feeding people emotionally, mentally, and physically–and what should be done about it?”

I’d like to begin by saying that I don’t think I’m qualified to answer this question. I’m not a pastor, minister, deacon, elder, priest, etc. I’m a layman, and I can only give a layman’s perspective on this question.

This question is a thorny one because it depends on what definition of “church” we are using. There are two general meanings involved in the word church. On the one hand is the Church, capital c (if you like), that represents all Christians everywhere. The bride of Christ, his kingdom on this earth, the body of which Christ is the head, etc. The other general meaning is your local church or faith community, i.e. where you go to worship. This might be expanded to include an entire denomination or section of Christianity. Some of these denominations, or individuals within them, have different ideas about what the capital c Church looks like. Some extremely fundamentalist churches might hold that only members of their denomination are “true Christians.” Some extremely unorthodox churches might teach that all humans are part of the Church. Most denominations fall somewhere in the middle. I myself (and I don’t think I’m far from what traditional Christian orthodoxy teaches) believe that all who hold their hope in Christ are members of the Church, whether they’re Baptists, Catholics, Eastern Orthodox, Coptic, Episcopalian, non-denominational, or come from a Stone-Campbell church like myself. I’ll leave it to God to decide which individuals are true Christians and which are not. It’s not for me to judge a man’s heart (and thank goodness for that!).

This leaves me with a sticky problem in attempting to answer the question above. It would be foolish to talk about how my own church (the one I go to on Sundays, or even the broader denomination) feeds people emotionally, mentally, and physically. The focus there is very narrow, and only helpful to people who go to the same church I do. On the other hand, if I try to talk about how the Church performs in these respects I’m faced with the fact that not all parts of the Church do the same things in the same ways with the same effectiveness. There are good churches and bad churches but they’re all part of the Church, if you get what I’m saying. I’m forced to answer with either extreme narrowness (my own church) or extreme generalities (all churches everywhere).

Still, I can make a stab at it.

I’ve never found an atheist or even an anti-theist who didn’t concede that the Church provides emotional fulfillment and meets emotional needs. The emotional relevance of Christianity is so obvious and potent that it often takes the form of a criticism: that is, people only believe in God because the concept makes people feel better. My own church tradition is heavy with the power of personal testimonies. As such I grew up hearing over and over how people had come from places of deep depression, confusion, and self-absorption and professed to be saved by Christ. Though not all churches put such emphasis on personal experiences or testimonies I’d argue it’s hard to find churches that don’t feed people’s emotional needs. Perhaps some very harsh fundamentalist churches, or some particularly tired and dusty liturgical traditions. The Holy Spirit speaks to the heart.

Mentally is a different matter. The Church has a long and storied tradition of scholarship, philosophy, and contemplation, with much intellectual meat for a hungry mind to chew on. You need look no further than C.S. Lewis to see that, though I would encourage you to look much further and wider than that. G.K. Chesterton, Thomas Aquinas, Anselm, Augustine, Francis Schaeffer, Karl Barth, Dietrich Bonhoeffer, and many others make up a grand tradition of Christian intellectualism. However, much of their work does seem closed off to the average churchgoer, though not by any policy or particular anti-intellectual movement. The simple fact is that these thinkers must be sought out. Most churches do not ask that their members read much more than the Bible.

I would like to see more individual churches introducing people to Christian philosophers and theologians. Few things make me sadder than hearing stories of people who left the Church because they had questions that others couldn’t, or wouldn’t, answer when there are answers to be had. Or, if not answers, then a long history of discussion and debate. Complicating all of this is the fact that people have very different temperaments and needs. Some people find their mental needs are fulfilled by the average church service. Others are hungry for more. If one of the latter seeks out answers from one of the former it can lead to trouble.

Sometimes I imagine that it would be useful if every church had a member who specialized in such matters: a person anyone could go to with hard intellectual questions, who could either answer them or give them the names of the authors who can. In most churches this is expected of the pastor (or priest, or minister, etc.), but most pastors have a lot on their plate already, and not all are as intellectual as some might like. Really what needs to happen is that the Church (capital c) needs more intellectuals within it. We need more people everywhere who seek out answers and learn to use their minds, so that everyone may be benefited by their knowledge. The Church has great storehouses of mental meat, but first we must choose to open up the pantry.

Finally there is the matter of the physical. In this theatre I believe that the Church has simultaneously done extraordinarily well and not nearly enough. According to the National Center for Charitable Statistics, the top international relief organization in the United States is World Vision, followed by Food for the Poor, both Christian nonprofits. Out of the top 5 international relief charities, 3 are Christian. All over the world the Church has opened soup kitchens for the hungry, has sent food to those who are starving, has clothed those who are freezing, and has generally tried to help those in need. The Church has built thousands of hospitals, orphanages, schools, and asylums. Members of the Church have devoted their lives to helping those in need. I would argue that the Church has done more to help those in need than any other group on Earth.

On the other hand, so much money is wasted and hoarded that is desperately needed elsewhere. Some megachurches spend millions on new buildings while sending mere thousands to starving refugees. Followers of the prosperity gospel believe that God would rather have them spend their excess cash on a luxury car than help people who are dying of curable diseases because they can’t afford medicine. I will speak to American Christians in particular here. American Christians have access to an incredible amount of money. We have access to more wealth than any Christian group in the history of the world. Yet most American Christians give less than 10% of their income to help those in need. Some estimate that if American Christians who report that religion is very important to their lives gave 10% of their after-tax income, an additional $46 billion per year would be raised. Do you know how much good can be done with $46 billion? Richard Stearns, CEO of World Vision (who would be in a position to know, considering the extent of their work), estimated that it would take about $70 billion to raise the lowest third of humanity out of extreme poverty. American Christians could bring two billion people who live on less than $2 a day into financial security in only two years if we wanted to. But American Christians don’t give that much. And when they do give, much of it goes to better buildings, fancier programs, and the newest A/V equipment for the local church. The Church has done immense good in this world: but we have been, and still are, capable of doing so much more.

In the end the only recommendation I can make is on the individual level. The Church is made up of individual Christians trying to live out their faith. If the Church is to become better, it must start with ourselves. We must learn to feed our brothers and sisters emotionally. We must educate ourselves and those around us in the great intellectual tradition of Christianity. We must use our own resources, our money, time, and effort, to meet the physical needs of all humanity. The change must begin and end with ourselves.

Swiftocracy!: The Swift Have Spoken!

Fortunately for myself, six people heeded Wednesday’s call and took their place as noble Swiftocrats. I now have my blog topics for the next two weeks. They are as follows:

Monday the 16th: “Where is the church with respect to the balance between feeding people emotionally, mentally, and physically–and what should be done about it?” A noble request from High Emperor Swiftocraft (being the swiftest of the swift has its advantages) Debilis.

Wednesday the 18th: “Logically and rationally defend something that you are very passionately opposed to.” A needling request from Vice High Emperor Swiftocrat Neefu.

Friday the 20th: “The history of some completely mundane thing we use every day,” which I have chosen from a whole bushel of topics from Senior Low Emperor Swiftocraft Dysole. Dysole, if you want you can comment to choose exactly which mundane item you want me to write about. Otherwise I’ll sort it out myself.

Monday the 23rd: “Review books that have become movies, books that should be and what that would look like, and find a way to go on a rant! Also, anything else you would like to add on this subject!” An enthusiastic request by Junior Low Emperor Swiftocraft Webchickitty.

Wednesday the CHRISTMAS!: I forgot that Christmas falls within the next two weeks so…yeah, I’m going to take a break. No blogging on Christmas.

Friday the 27th: “Explain your top 5 most convincing evidences that the earth is relatively young,” requested by Grand Vizier Swiftocraft Seabeck.

Monday the 30th: “Horny Toads and their ability to defend themselves from predators,” a suitably random request by Grand High Senior Cabin Boy Swiftocraft Suckmywake.

I regret my foolish decision to surrender my own autonomy to a band of internet commenters. Still, the Swiftocracy has spoken, and I will heed their call.

I’m starting to miss coming up with my own ideas.

Swiftocracy!: A Chance to Rule My Blog


You may have noticed that the blog has been fairly dead lately. Posts are sporadic. My goal has been three posts a week and these days it’s more like one. Maybe. I put some of the blame for this on my job, which takes up a lot of time. But to be honest I really do have time to write blog posts. I just can’t think of anything to write about.

I mean I’ve had thoughts. I’ve thought about writing about video game criticism as a growing discipline (but really, when was the last time I wrote about a video game? That’s not really what this blog is about), or about some of my theories about Neanderthals and cold weather (like this blog needs a Communications major to theorize about biology, right), or even about how to make glass from scratch (interesting, but it doesn’t have to do with much of anything, and since I’ve never tested it myself…). None of the ideas I get really work very well. It’s an issue.

You might recall my recent post about generating ideas. I stand by that post. I still think that the best way to generate posts is to work in limits. The problem is that I can’t even think up any limits.

That’s where you come in.

I’m starting a new feature called “Swiftocracy!,” as in, “Rule by the swift.” The first six people to comment get to dictate what I will write about for the next two weeks. In your comment tell me what you want to hear me write about. It could be anything. You don’t even have to be specific. You could just write “BEAR CANDLES” and I’ll do my best to find something interesting about bears in regards to candles. In fact, the shorter and more limiting the writing prompt the better. This is your big chance. What do you want to hear me write about? C.S. Lewis? Young Earth Creationism? Movies? Bookbinding? Pad Thai? It can be anything you want. You have the power, but only if you are in the first six commenters. For the next two weeks this blog will be a swiftocracy.

(If you’ve never commented before then your comment may not show up right away, as I have to approve new commenters. Don’t worry, I’ll make sure to check the timestamps so the first six really are the first six. If this bumps one of the regular commenters out of the top six list, I’m sorry but you just weren’t swift enough. If your comment wants me to write about something obscene or hateful it will be trashed. Even swiftocracies have balances.)

Who will rule this blog for the next two weeks? Will they be fair? Just? Wise? I don’t know. But they will be fast. So get out there! The race starts now!