The term “theory” is one of the most difficult to get across to students in the science classroom. We all inherently know what it means, but putting it into practice, or understanding how it influences everyday decisions and processes, is very, very difficult. Science defines “theory” in systematic terms (and no, folks, it does not equate with “educated guess”), but the reality is that everyone conducts their business and reacts to the particulars of their day under their own theoretical terms – although I’m willing to bet that almost no one realizes this. And few understand how critical a role “theory” plays in how we interpret facts and develop new ideas to move forward.
Consider how we approach the concept of natural resource management. The fundamental difference we see played out in the media, with management agencies, in political discourse, and in the courts, is the difference between those who see humans as a part of the natural world and those who see humans as separate from it. The reasons for this dichotomy are legion, and more ink has been spilled and more trees cut to produce written statements on this issue than on almost any other argument in human history. But the point is that any argument over specific management principles, laws, or even the definition of conservation essentially revolves around the tension between these two theoretical approaches.
One of the more difficult exercises we did in graduate school while reading reams of scientific literature was defining the theoretical perspective the writers were taking. Often it was clear. Sometimes it was convoluted, and frequently you never knew where the authors were coming from or why they were taking the argument in the direction it was going. With regard to the issue of natural resource management, I’ll spare you from having to divine my theoretical perspective: I fully embrace the notion that humans are part of nature and not separate from it; therefore we absolutely cannot take a “hands-off” approach and assume everything will be just fine with regard to the natural world. Just because humans have evolved the ability to be self-aware and can envision potential future consequences of current actions does not mean we are any less a part of the natural fabric than elk, wolves, chipmunks, or paramecia. Therefore, management approaches must be developed with this overriding perspective.
So, I hear some people, including my own relatives, wonder how hunting and taking the lives of other living creatures can be an effective conservation measure. It somehow seems counter-intuitive to them – preserving animals by not killing them would seem more in tune with “conservation”. I see two fundamental flaws in this argument. First, it masks more severe impacts on successful conservation by scapegoating hunters as the cause of wildlife decline. This view assumes that active management by humans is, by definition, outside natural processes and counterproductive to wildlife conservation. This is an assumption lacking scientific credibility. More importantly, this perspective ignores a much greater, albeit indirect, impact on wildlife posed by the loss of habitat, blocked migration corridors, disruption of ecosystem connectedness, and a host of other interrupted biological processes brought about simply by the expansion of humans outside of urban centers. Building houses, paving roads, clearing trees for pasture, and putting up fences all contribute significantly to declining wildlife populations, even more so than hunting. As I pointed out in a previous post, anyone sincerely concerned with the viability of wildlife populations needs to first control the spread of humans and domesticated animals before blaming hunting. Which brings me to the second flaw in the preservation-only argument:
This theoretical perspective assumes that hunting cannot be beneficial to the long-term health of wildlife populations. However, it is most often hunting communities that halt the general expansion of humans across the landscape, restore habitats beneficial to wildlife, or create economic incentives to sustain wildlife where otherwise it would be swept aside. My experiences in Africa and elsewhere demonstrated that the most ecologically intact environments were those where hunting had a significant presence. While with the Hadza in Tanzania, I saw that the Lake Eyasi environments where they lived held far greater numbers and diversity of wildlife than adjacent areas dominated by agriculturists and other villagers. Hunters largely kept poachers out of the area and constantly manipulated the environment in ways conducive to wildlife but not to domestic animals – the bane of wildlife viability. In South Africa, hunting provides a means to help feed locals and leads to the expansion of wildlife habitat. Where it’s done correctly, hunting gives people incentives to stop expanding agricultural fields and pastures and instead reclaim the land for wildlife. And in North America it is hunters, not so-called “conservation” or animal protection organizations, who contribute the most funding and the most volunteer work toward the goal of increasing wildlife populations.
Like any good theory, the idea that active human management of the environment is now part of nature’s processes did not develop in a vacuum. I began to accept that idea the more I witnessed human behavior in a variety of contexts, studied how extinct human societies interacted with and modified (for better or worse) the ecology around them, and saw data that were more consistent with such a view. In the posts to come I’ll tell you of those experiences and show you some of the data that lends scientific credibility to active resource management.