Why it’s so hard to unravel the mysterious origins of domestic horses

Infomastern/Flickr, CC BY-SA

Jan Hoole, Keele University

There’s still a lot we don’t know about how, and where, horses were first domesticated. Experts long thought that all modern horses were probably descended from a group of animals that belonged to the Botai culture, which flourished in Kazakhstan around 5,500 years ago.

But now, a new study published in Science suggests that the Botai horses were not the ancestors of our modern equine companions – and challenges what we thought we knew about one of the only “wild” horse species left today: the Przewalski’s horse.

There are now very few, if any, genuinely wild horse species that have never been domesticated. Scientists have known since studies of equine mitochondrial DNA in 2002 that Przewalski’s horse is not an ancestor of modern domestic horses. But now it seems that, far from being the last remnant of a truly wild horse species, the Przewalski’s horse is the feral descendant of the domesticated Botai horses.

Study authors imagine what Przewalski’s ancient ancestors would have looked like.
Ludovic Orlando, Sean Goddard, Alan Outram, CC BY

Let’s take a look at the science.

Born wild?

Led by Charleen Gaunitz from the Natural History Museum of Denmark, the study’s 47 authors sequenced the genomes of 42 ancient horses from Kazakhstan and various sites in Eurasia, and compared them with published data from 46 ancient and modern horses.

Their analysis showed that Przewalski’s horse and the most ancient horses from Eurasia were not genetically similar, as might be expected. In fact, Przewalski’s horses were found to be most closely related to the Botai horses, while all modern domesticated horses belong to a separate group. If this is right, it turns what we thought we knew about wild and domestic horses on its head.

But one of the difficulties of drawing conclusions from the DNA of a modern Przewalski’s horse is that the species suffered a massive decline in the first half of the 20th century. The last one seen in the wild was spotted back in the 1960s, and the species was subsequently declared extinct in the wild. A captive breeding programme began, and all of today’s Przewalski’s horses trace their ancestry back to 13 individuals that were in zoos around the world at the time. Equus ferus przewalskii was reintroduced to the wild at the end of the 20th century.

Back where they belong: Przewalski’s horses in the Mongolian wilderness.
gsz/Flickr, CC BY-NC-ND

Gaunitz and her colleagues suggest that there has been considerable introgression of modern horse genes into the species. But the team was fortunate enough to have DNA from one specimen dating to the 19th century, before the population collapse occurred. This allowed them to show that the Botai horses were direct ancestors of another group of early Bronze Age horses, known as Borly4, and that these Borly4 horses were the direct ancestors of the pre-collapse Przewalski’s horse.

Unsolved mysteries

Who nose?
RPatts/Flickr, CC BY-NC-ND

This leaves the origins of modern horses shrouded in mystery. It seems they are descended from a completely different group of horses, but the genomic analysis suggests that they managed to interbreed with the Botai horses to a small degree as the population expanded across the continental landmass. The authors of the study suggest that Hungary, in Eastern Europe, might be one of a number of places where the ancestors of modern horses were first domesticated, because the oldest horse remains were recovered from there.

Earlier studies have suggested Iberia, North Africa and Eurasia as possible sites of domestication. And it seems likely that horses – like dogs – were independently domesticated in a number of different places and over a long period of time.

Scientists – and horse owners – often wonder exactly how horses were domesticated. It has been suggested that they were originally prey animals that humans began to protect and breed to ensure a steady supply of meat. Over time their keepers began to use them for milk, hides and transport. Alternatively, they may have been deliberately brought under human control to help with the hunting of wild horse herds.

Catch us if you can.
Brian395/Flickr, CC BY

Whatever the method, it now seems likely that the very robust horses of the Botai were not the ultimate ancestors of the delicate modern thoroughbred racehorse, nor of the heavy draught horses that were the staple workforce of agriculture in many parts of the world until the beginning of the 20th century.

The Botai horse genes are preserved only in the small and precarious populations of Przewalski’s horse, which struggle to survive in the areas of the Gobi desert and the mountain steppe regions of Mongolia where they were reintroduced. All the more reason, then, to continue to ensure the survival of this species – possibly the last repository of ancient horse DNA.

Jan Hoole, Lecturer in Biology, Keele University

This article was originally published on The Conversation. Read the original article.

A new paper confidently claims that there are four giraffe species rather than one, but I’m not so sure

It is likely we have all heard the news: “DNA studies have revealed there are actually four extant giraffe species instead of one.”

Jerry Coyne explains why this might not be so. It is a great article and the arguments he puts forth are quite solid. Enjoy!

Why Evolution Is True

The giraffe, Giraffa camelopardalis, was first described by Linnaeus, and gets its species name from its fancied resemblance to a hybrid beast (as Wikipedia notes, the name comes from the Greek “καμηλοπάρδαλις”, meaning “giraffe”, from “κάμηλος” (kamēlos), “camel” + “πάρδαλις” (pardalis), “leopard”, due to its having a long neck like a camel and spots like a leopard). It’s always been considered a single species, but divided into about a half dozen subspecies that live in different areas and are distinguishable by different patterns of reticulation in their coats. Here’s an old subspecies designation and map; note that the populations included in each of the six subspecies live in different areas:

[Image: map showing the ranges of the six giraffe subspecies]

Here’s a classification of nine subspecies based on coat pattern (the number of named subspecies has ranged between four and about nine; I haven’t searched extensively):

[Image: “Do you know your giraffes?” chart of nine subspecies coat patterns]

Note that this classification is more or less arbitrary because the populations are geographically isolated and so…

View original post 2,032 more words

Wild Equus – Facebook newsfeed tweaks

Thanks to your continued support and interest, the Equilibre – Wild Equus Facebook page is growing slowly but surely.

Facebook is constantly changing how it delivers content to your newsfeed. Typically, your default Facebook settings will deliver only around 16% of our posts to you. Here are some easy steps to help you receive more posts from the Wild Equus page.

Step 1

If, for whatever reason, you have not joined our page yet, and wish to do so, click the image below, then click on the “Like” button.

[Image: the Equilibre – Wild Equus Facebook page]

Step 2

Hover your cursor over the “Liked” button. A drop-down box should appear. The section “In your news feed” is typically set to “Default”; click the option “See first” to prioritize posts from our page.

Step 3

Next, hover your cursor over the “Liked” button once again. In the section “Notifications”, click “All on” to receive notifications from Facebook when we post. You should now, in principle, receive notifications of our posts in your news feed.

Step 4

You can also add Equilibre – Wild Equus to your “favorites”. While on our Facebook page, hover your cursor over the button with the three dots (more options), to the right of the “Like” and “Message” buttons. Click on it, and in the drop-down click “Add to favorites”.

The above tweaks should help you receive many more of our notifications. At the same time, it helps us reach more people. So, thanks again!

How people can live next to lions without killing them – new study

Grant Hopcraft, University of Glasgow

There is a sense of haunting to the roar of a lion veiled in darkness. The emphatic “ooooaa!” demands attention as it starts in the abdomen and reverberates through the night air. Its direction and distance are secondary to one’s primordial reaction – a sudden dilation of the pupils and a flare of prickles on the neck. The call unmistakably announces a large carnivore, yet as each roar fades into solitary grunts it feels less like an act of aggression than the lonely imploring of a lost soul in the darkness.

The plight of Africa’s lions is lamentable. Since the 1960s, the world has lost at least 70% of these magnificent cats, which until a few thousand years ago inhabited most of Europe, Asia and the Americas. Now we’re down to around 20,000, all of them in Africa apart from one sub-species in India. Habitat loss and the encroachment of people are largely responsible – lions in Kenya and Tanzania are shot by wildlife officials if they consistently kill livestock, for instance. And trophy hunters still shoot lions in the wild every year in countries where it is permitted, including Tanzania and Zimbabwe.

But if you were expecting a fable in which cunning Human steals from noble Lion, this story is not so clear-cut. Living with these predators is not easy. For many people in rural Africa, livestock pay for school fees and hospital bills, and insure against misfortune. Imagine finding half your nest egg has been taken overnight and, worse, worrying your family might be next. Unsurprisingly, many lions that live near people end up shot or poisoned. Yet it doesn’t have to be this way. A new five-year study that I have been involved in shows that when people directly benefit from lions, they become more tolerant of the cats’ faults.

Conservancies

We focused on an area surrounding the Mara National Reserve in Kenya, a protected zone at the northern extent of the Serengeti ecosystem. These fertile grasslands are the home of the Maasai, semi-nomadic pastoralists who share them with the great annual migration of over a million wildebeest and their predators, including lions.

Maasai tribeswoman.
Avatar_023

Maasai have always speared any rogue that dares interfere with their livestock; and today there are far more people and livestock and much less space for lions. Yet many on the northern edge of the Mara have wisely noted the premiums that tourists are prepared to pay for the Serengeti experience.

Lions help attract over 350,000 visitors to the area every year, generating $90 million (£63 million) in entrance fees alone. Beyond the national reserve, many families have combined landholdings into community conservancies which welcome visitors for a fee. They attract wildlife by managing and protecting resources such as livestock, water and unique habitats; and they distribute income fairly around the community to avoid feuds. Other families have declined this opportunity, relying purely on their livestock for income.

Grant Hopcraft

Hence there is a 1,500 sq km patchwork of conservancies and other privately owned pastureland to the north of the Mara National Reserve. Together with the reserve itself, where no one lives and lions can roam freely, it amounts to a perfect three-way natural experiment to investigate the effects of conservancies on lions. Lead author Sara Blackburn and Laurence Frank, a veteran predator biologist, spent five years observing the lifespans of 382 lions in the area. This is the first time anyone has looked at the survival rates of individual lions in relation to conservancies, rather than just counting them.

The natural life expectancy of a lion living in the wild rarely exceeds 13 years. When we compared the survival of lions living outside national parks, our results consistently showed that survival is not determined by how many prey are available or the quality of the habitat – there are enough of both to sustain this population. The number of livestock in a lion’s territory makes no difference either.

The only factor that consistently cuts short a lion’s life, sometimes lowering the chances of survival by as much as 40%, is the number of homesteads in its territory that are not part of a community conservancy. Homesteads that are members of a conservancy, on the other hand, have no negative effect on lions’ survival chances. This suggests that when people receive income from lions via ecotourism, they become tolerant and lions survive. There is a good chance that the same would also be true for other animals that are declining across the region, such as giraffe and impala.

Next steps

Cecil, the lion shot by an American hunter last year, drew a line in the sand for public opinion on the conservation of this remarkable predator. These events have sparked heated debates about the role of trophy hunting and of using fences to protect lions in the wild, even as recent footage of an agitated lion walking the streets of Nairobi highlights the continued struggle for space.

In this worrying context, our research points to how this story can end more happily. Community conservancies are a viable, working approach to protecting wildlife. Although they exist in many parts of Kenya and Tanzania, we must continue encouraging governments to develop similar opportunities for local communities to benefit from wildlife through ecotourism. Evidence such as ours gives reason to be optimistic that community conservancies will continue to expand and benefit human and lion alike.

Sara Blackburn, an MSc student in biodiversity and conservation, assisted in the writing of the piece


Grant Hopcraft, Research Fellow, University of Glasgow

This article was originally published on The Conversation. Read the original article.

Thinking critically on critical thinking: why scientists’ skills need to spread

Rachel Grieve, University of Tasmania

MATHS AND SCIENCE EDUCATION: We’ve asked our authors about the state of maths and science education in Australia and its future direction. Today, Rachel Grieve discusses why we need to spread science-specific skills into the wider curriculum.

When we think of science and maths, stereotypical visions of lab coats, test-tubes, and formulae often spring to mind.

But more important than these stereotypes are the methods that underpin the work scientists do – namely generating and systematically testing hypotheses. A key part of this is critical thinking.

It’s a skill that often feels in short supply these days, but you don’t necessarily need to study science or maths in order to gain it. It’s time to take critical thinking out of the realm of maths and science and broaden it into students’ general education.

What is critical thinking?

Critical thinking is a reflective and analytical style of thinking, with its basis in logic, rationality, and synthesis. It means delving deeper and asking questions like: why is that so? Where is the evidence? How good is that evidence? Is this a good argument? Is it biased? Is it verifiable? What are the alternative explanations?

Critical thinking moves us beyond mere description and into the realms of scientific inference and reasoning. This is what enables discoveries to be made and innovations to be fostered.

For many scientists, critical thinking becomes (seemingly) intuitive, but like any skill set, critical thinking needs to be taught and cultivated. Unfortunately, educators are unable to deposit this information directly into their students’ heads. While the theory of critical thinking can be taught, critical thinking itself needs to be experienced first-hand.

So what does this mean for educators trying to incorporate critical thinking within their curricula? We can teach students the theoretical elements of critical thinking. Take, for example, working through statistical problems like this one:

In a 1,000-person study, four people said their favourite series was Star Trek and 996 said Days of Our Lives. Jeremy is a randomly chosen participant in this study, is 26, and is doing graduate studies in physics. He stays at home most of the time and likes to play videogames. What is most likely?

  1. Jeremy’s favourite series is Star Trek
  2. Jeremy’s favourite series is Days of Our Lives

Some critical thought applied to this problem tells us that Jeremy is most likely to prefer Days of Our Lives. His profile may fit the Star Trek stereotype, but only four of the 1,000 participants named Star Trek: the base rate overwhelms the stereotype.
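To see just how decisively the base rate wins, here is a minimal sketch of the calculation using Bayes’ rule. The likelihood figures are invented purely for illustration: suppose half of all Star Trek fans fit Jeremy’s profile, but only one in twenty Days of Our Lives fans do.

```python
# Base-rate reasoning via Bayes' rule. The likelihoods below are
# invented for illustration; only the 996-vs-4 priors come from the
# problem as stated.

p_trek = 4 / 1000    # prior: participants who prefer Star Trek
p_days = 996 / 1000  # prior: participants who prefer Days of Our Lives

p_profile_given_trek = 0.50  # assumed: Star Trek fans matching Jeremy's profile
p_profile_given_days = 0.05  # assumed: Days of Our Lives fans matching it

# Unnormalised posteriors (the shared denominator cancels when comparing)
post_trek = p_trek * p_profile_given_trek  # 0.004 * 0.50 = 0.0020
post_days = p_days * p_profile_given_days  # 0.996 * 0.05 = 0.0498

print(f"Star Trek: {post_trek:.4f} vs Days of Our Lives: {post_days:.4f}")
```

Even when the profile is assumed to be ten times more typical of a Star Trek fan, the 996-to-4 prior leaves Days of Our Lives roughly 25 times the more likely answer.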

Can you teach it?

It’s well established that statistical training is associated with improved decision-making. But the idea of “teaching” critical thinking is itself an oxymoron: critical thinking can really only be learned through practice. Thus, it is not surprising that engagement with the critical thinking process itself is what pays dividends for students.

As such, educators try to connect students with the subject matter outside the lecture theatre or classroom. For example, problem-based learning is now widely used in the health sciences, whereby students must figure out the key issues related to a case and direct their own learning to solve that problem. Problem-based learning has clear parallels with real-life practice for health professionals.

Critical thinking goes beyond what might be on the final exam and life-long learning becomes the key. This is a good thing, as practice helps to improve our ability to think critically over time.

Just for scientists?

For those engaging with science, learning the skills needed to be a critical consumer of information is invaluable. But should these skills remain in the domain of scientists? Clearly not: for those engaging with life, being a critical consumer of information is also invaluable, allowing informed judgement.

Being able to actively consider and evaluate information, identify biases, examine the logic of arguments, and tolerate ambiguity until the evidence is in would allow many people from all backgrounds to make better decisions. While these decisions can be trivial (does that miracle anti-wrinkle cream really do what it claims?), in many cases reasoning and decision-making have a substantial impact, with some decisions having life-altering effects. A timely case in point is immunisation.

Pushing critical thinking from the realms of science and maths into the broader curriculum may lead to far-reaching outcomes. With increasing access to information on the internet, giving individuals the skills to critically think about that information may have widespread benefit, both personally and socially.

The value of science education might not always be in the facts, but in the thinking.

This is the sixth part of our series Maths and Science Education.


Rachel Grieve, Lecturer in Psychology, University of Tasmania

This article was originally published on The Conversation. Read the original article.

Identification of animals and plants is an essential skill set

Susan Lawler, La Trobe University

I have recently been made abundantly aware of the lack of field skills among biology students, even those who major in ecology. By field skills we mean the ability to identify plants and animals, to recognise invasive species and to observe the impact of processes such as fire on the landscape.

My colleague Mike Clarke calls it “ecological illiteracy”, and identifies it as a risk for nature at large. While people spend more time indoors in front of screens, we become less aware of the birds, plants and bugs in our backyards and neighbourhoods. This leads to an alienation of humans from nature that is harmful to our health, our planet and our spirit.

On a more practical, academic level, I was in a meeting this week where an industry representative complained that biology graduates are no longer able to identify common plants and animals. This limits their employment prospects and hampers the capacity of society to respond to changes in natural ecosystems predicted by climate change.

Field taxonomy vs. Bloom’s taxonomy

So what is going on? Why don’t ecology students get this information during the course of their University degrees?

Practical sessions teaching scientific names of animals or plants can be perceived to be boring and dry. Students may be asked to collect and pin a range of insects or press and identify certain plants as part of their training in biological diversity, but these activities are time consuming and expensive. As we strive to be more flexible and efficient, classes and assessments relying on identification skills are quickly dropped.

Ironically, the dogma that has been so detrimental to field taxonomy is known as Bloom’s taxonomy. University lecturers are told to apply an educational theory developed by Benjamin Bloom, which categorises assessment tasks and learning activities into cognitive domains. In Bloom’s taxonomy, identifying and naming are at the lowest level of cognitive skills and have been systematically excluded from University degrees because they are considered simplistic.

The problem is that identifying a plant or insect is not simple at all. Not only do you need to know which features to examine (nuts, leaves, roots, spines, eye stripes or wing venation), you need to adopt a whole vocabulary of terms designed to provide precision in the observation of specific traits. Examining the mouthparts of insects requires knowing the difference between a mandible, maxilla and rostrum. Hairs on a leaf can be described as glaucous, glabrous, or hirsute.

Such detail cannot be taught without a student passionate enough to embrace the task, and a mentor passionate enough to make the discipline come alive.

Photographs are not enough

In this digital age some people seem to think that photographs can replace the collection of specimens. I know a bit about crayfish, and where in the past a fisher might show up with an animal in an esky, these days people like to send me a photo and ask what species it is. I cannot identify a crayfish from a photo, nor can I easily explain to an interested amateur how to count the mesal carpal spines.

There is a reason that scientists must collect specimens and take them back to the lab or lodge them with a museum. Biological organisms are extremely complex, and the critical feature that distinguishes one from another relies on careful comparison.

A recent discovery of a rare kingfisher in Guadalcanal caused controversy in the Washington Post when the researchers photographed, then killed and collected the animal. I understand why they felt they needed to document their finding with a specimen, and I understand the outrage of nature lovers who decry the need for more than a photo.

Australian species are poorly known

A recent article about the loss of field skills in Britain claims that there is no excuse now that there are so many complete field guides available. The author argues that in the United Kingdom, the golden age of biological recording is over.

It is true that in some parts of the world the species have all been named and catalogued, but Australia is not one of those places. Any shake of a shrub will produce unnamed insects. Every Bush Blitz expedition discovers new species or new records of known species.

Young people need field trips

I spent last week in the Victorian alps with biology students from La Trobe University. As part of their research project they needed to identify plants and insects. We had some impressive expertise among our staff, people who knew the Latin names of every plant at first glance. The trick is to transmit that knowledge to the next generation.

Accordingly, we made the students tape leaves into their notebooks and write names next to each one. We brought the insects back to the lodge and sat in front of microscopes for hours. Using keys, identification books and each other, we were able to describe the particular community at each study site.

Some of the students came away excited about different groups of organisms. The excitement of the camp may lead them to spend time away from their desks staring at gum leaves, listening for bird calls or popping bugs in jars for later inspection.

I hope that some of them become obsessed enough to turn themselves into experts, but I also want all young people to have more exposure to nature and all of its parts.

Not everyone can spend time in the alps, but everyone can learn the names of the trees in a nearby park. Can you identify the birds calling in your backyard? Do you know the difference between a moth and a butterfly, or between a worm and a grub?

Take the time to engage with both the little and big things growing around you and discover the joy of re-connecting with nature.


Susan Lawler, Senior Lecturer, Department of Ecology, Environment and Evolution, La Trobe University

This article was originally published on The Conversation. Read the original article.

Naturalists are becoming an endangered species

David Norman, University of Cambridge

The phrase “Natural History” is linked in most people’s minds today with places that use the phrase: the various Natural History Museums, or television programmes narrated so evocatively by renowned naturalist Sir David Attenborough.

As times have changed, used in its traditional sense the phrase now has an almost archaic ring to it, perhaps recalling the Victorian obsession with collecting butterflies or beetles, rocks or fossils, or stuffed birds and animals, or perhaps the 18th century best-seller, Gilbert White’s The Natural History of Selborne.

Once natural history was part of what was equally archaically called natural philosophy, encompassing the enquiry into all aspects of the natural world that we inhabit, from the tiniest creature to the largest, to molecules and materials, to planets and stars in outer space. These days, we call it science. Natural history specifically strives to study and understand organisms within their environment, which would these days equate to the disciplines of ecology or conservation.

In a recent article in the journal BioScience, a group of 17 scientists decry what they see as a shift away from this traditional learning (once a typical part of biology degrees) that taught students about organisms: where they live, what they eat, how they behave, their variety, and their relationships to the ecosystems in which they live.

Drawn partly by the promise of a course-specific career, and perhaps partly by poorly taught courses that can emphasise rote learning, students are enticed into more exciting fields such as biotechnology or evolutionary developmental biology (“evo-devo”), where understanding an organism is less important than understanding the function of a particular organ or limb.

But their challenge is not simply a revolt against the new: they note that 75% of infectious diseases that affect humans, such as bird flu, cholera and rabies, have links to other animals at some point in their lifecycle. By understanding more about these other creatures and how the disease affects them – their natural history – we are better placed to tackle the effects of the disease on us.

The authors make the same argument for other fields of study: useful sustainable agricultural practices such as companion planting, crop rotation, and pest control are making a comeback after having been largely discarded under modern farming methods. A more thorough understanding of fish lifecycles and surveys of their population could prevent disastrous fishery collapse, such as happened with walleye pollock in the Bering Sea. And the decision to rigorously suppress forest fires in the western US – on the basis of forest management principles imported from different places with different species – now costs US$1 billion a year, a cost that could have been avoided if the important natural role of fire in the ecosystem had been recognised.

It is a fact that at university level we produce fewer and fewer field-based biologists – experts who know and understand the range and variety of creatures found in real world environments, and how they live. Everything is becoming increasingly modular and based on computer models in labs as biology becomes more technologically driven.

Many of our greatest natural historians probably caught the bug precisely because of fieldwork excursions at school – the wonders of nature seen in samples taken after pond-dipping, for example. How often does that happen now? (Due to health and safety restrictions, curriculum pressures on teachers, or the absence of staff with sufficient knowledge or interest, or indeed all three, it may not happen at all.)

Scientists increasingly want to analyse data to look for trends, connections, and signs of the bigger picture. But where does this data come from? From field collections. And crucially, this data is dependent upon the fidelity of records – such as those held in natural history museums, no less. Who are or were these collectors, what is their level of expertise, and how reliable is their record-keeping? And who assesses the quality of such data?

The answer depends on whether such expertise is valued, intellectually and economically, because only then will career paths emerge that will attract the future Darwins, Huxleys, Tinbergens, Fords, Southwoods, Krebs and Davies. Sir David Attenborough and Gerald Durrell have done wonderful jobs to pique our interest (and our conscience) when it comes to the natural world, and the BBC’s programmes such as Springwatch also play their part. But where are the careers for the next generation?

The authors aim to provoke and have a serious point to make. But theirs is a small voice within a deafened and distracted world. We need to realise that it is not just the data-wranglers, modellers and manipulators, but genuine expertise and enthusiasm that is needed on the ground. Keen amateur natural historians are fine, but they are not a career path for today’s future scientists. Only by expending more effort (and money) on ensuring there is training and careers for the next natural historians and biologists can we be sure that we can rely on their expertise to guide future conservation, management or policy decisions that could avoid us repeating past mistakes.

Without new naturalists, our museums with their extraordinarily rich collections of creatures and much more besides will appear as increasingly archaic, pointless stockpiles that fewer and fewer people appreciate.


David Norman, Reader in Paleobiology, Curator of Palaeontology, Sedgwick Museum of Earth Sciences, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Darwin’s finches highlight the unity of all life

Frank Nicholas, University of Sydney

When Charles Darwin visited the Galapagos Islands in October 1835, he and his ship-mates on board HMS Beagle collected specimens of birds, including finches and mockingbirds, from various islands of the archipelago.

At the time, Darwin took little interest in the quaint finches, making only a one-word mention of them in his diary. As painstakingly shown by Frank Sulloway and more recently by John van Wyhe, it wasn’t until two years later that the finches sparked Darwin’s interest.

By then he had received feedback from the leading taxonomist of the time, John Gould, that the samples comprised 14 distinct species, none of which had been previously described! Gould also noted that their “principal peculiarity consisted in the bill [i.e. beak] presenting several distinct modifications of form”.

So intrigued was Darwin by this variation in size and shape of beaks that in the second (1845) edition of Journal of Researches he included illustrations of the distinctive variation between species in the size and shape of their beaks. He added a comment that:

Seeing this gradation and diversity of structure in one small, intimately related group of birds, one might really fancy that from an original paucity of birds in this archipelago, one species had been taken and modified for different ends.

The famously varied beak shapes of the Galapagos finches, as illustrated in the second edition of Darwin’s Journal of Researches.
Wikimedia

Unfortunately for Darwin, the closer he examined the available evidence on Galapagos finches, the more confusing the picture became. This was partly because the specimens available to him were not sufficiently labelled as to their island of collection.

Presumably, it was his doubt about the available evidence that resulted in Darwin making no mention of Galapagos finches in any edition of Origin of Species.

Why, then, do people now label them as “Darwin’s finches”, and why are these finches now regarded as a classical textbook example of his theory of evolution by natural selection?

Paragons of evolution

Despite not mentioning Galapagos finches, Darwin did make much use of evidence from other Galapagos species (especially mockingbirds) in Origin of Species.

As the influence of Origin of Species spread, so too did the evolutionary fame of the Galapagos Islands. Increasingly, other biologists were drawn into resolving the questions about finches that Darwin had left unanswered.

By the end of the 19th century, Galapagos finches were among the most studied of all birds. By the mid-20th century, there was abundant evidence that Galapagos finches had evolved to fill the range of ecological niches available in the archipelago – a classic example of evolution by adaptive radiation.

Beak size and shape were key attributes in determining adaptation to the different types of food available. In the second half of the 20th century, classic research by Princeton University’s Peter and Rosemary Grant provided evidence of quite strong natural selection on beak size and shape.

Under the hood

New light has also been shed on the evolution of Darwin’s finches in a paper recently published in Nature. In this latest research, the entire genomes of 120 individual birds from all the Galapagos finch species, plus two closely related species from other genera, were sequenced.

The work was done by a team led by Swedish geneticist Leif Andersson, with major input from Peter and Rosemary Grant, who are still leading experts on the finches.

Comparison of sequence data enabled them to construct a comprehensive evolutionary tree based on variation across the entire finch genome. This has resulted in a revised taxonomy, increasing the number of species to 18.
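For readers curious about the mechanics, here is a toy sketch of the general idea behind building a tree from sequence variation: aligned sequences are compared pairwise, and the smallest genetic distances are joined into branches first. The sequences and species labels below are invented, and a real genome-scale study uses far richer models (and methods such as neighbour joining or maximum likelihood) than a raw mismatch count.

```python
# Toy illustration of distance-based tree building (not the study's
# actual pipeline). Sequences and names are invented for demonstration.
from itertools import combinations

sequences = {
    "ground_finch":  "ACGTACGTACGTACGT",
    "tree_finch":    "ACGTACGAACGTACGT",
    "warbler_finch": "ACGAACGAACGTACGA",
}

def hamming(a, b):
    """Proportion of aligned sites at which two sequences differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Pairwise distances: a clustering method such as UPGMA or neighbour
# joining would join the closest pair into a branch first.
for (name1, seq1), (name2, seq2) in combinations(sequences.items(), 2):
    print(f"{name1} vs {name2}: {hamming(seq1, seq2):.3f}")
```

Here ground_finch and tree_finch would pair up first (distance 0.062), mirroring how genome-wide distances place close relatives on neighbouring branches of the finch tree.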

The most striking feature of the genome-based tree is the evidence for matings between different populations, resulting in the occasional joining of two branches of the tree. This evidence of “horizontal” gene flow is consistent with field data on matings of finches gathered by the Grants.

A comparison of whole-genome sequence between two closely related groups of finches with contrasting beak shape (blunt versus pointed) identified at least 15 regions of chromosomes where the groups differ substantially in sequence.

Unity of life

The most striking difference between the two groups was observed in a chromosomal region containing a regulatory gene called ALX1. This gene encodes a polypeptide that switches other genes on and off by binding to their regulatory sequences.

Like other such genes, ALX1 is crucially involved in embryonic development. Indeed, mutations in ALX1 in humans and mice give rise to abnormal development of the head and face.

It is an extraordinary illustration of the underlying unity of all life on Earth that Leif Andersson and his colleagues have shown that the ALX1 gene also has a major effect on beak shape in finches, and that this gene has been subject to natural selection during the evolution of the Galapagos finches.

If Darwin were alive today, he would be astounded at the power of genomics tools such as those used in generating the results described in this paper. He would also be delighted to see such strong evidence not only in support of evolution but also in support of one of its major forces, natural selection.


Frank Nicholas, Emeritus Professor of Animal Genetics, University of Sydney

This article was originally published on The Conversation. Read the original article.

Best wishes for 2016 from Wild Equus Network!

In June 2015, the Wild Equus Network (WEN) was founded as a global network of like-minded people committed to the welfare of domestic equines, and the conservation of wild and free-roaming equids around the world.

After only six months, we are happy to announce that the WEN has grown.

The number of passionately committed people who have joined the WEN, whether as Ambassadors, Researchers, Photographers or Sponsors, is overwhelming. We are grateful and humbled by your continued support!

Through you, the WEN gains extensive knowledge and experience, and at the same time is inspired to strive to become a resonating voice for wild, free-ranging, and domestic equids worldwide.

May your 2016 be even better than 2015!!!
The Wild Equus Network wishes you all a happy and fruitful 2016!!!

Setting aside half the Earth for ‘rewilding’: the ethical dimension

William Lynn, Clark University

A much-anticipated book in conservation and natural science circles is EO Wilson’s Half-Earth: Our Planet’s Fight for Life, which is due early next year. It builds on his proposal to set aside half the Earth for the preservation of biodiversity.

The famous biologist and naturalist would do this by establishing huge biodiversity parks to protect, restore and connect habitats at a continental scale. Local people would be integrated into these parks as environmental educators, managers and rangers – a model drawn from existing large-scale conservation projects such as Area de Conservación Guanacaste (ACG) in northwestern Costa Rica.

The backdrop for this discussion is that we are in the sixth great extinction event in Earth’s history. More species are being lost today than at any time since the end of the dinosaurs. There is no mystery as to why this is happening: it is a direct result of human depredations – habitat destruction, overpopulation, resource depletion, urban sprawl and climate change.

Wilson is one of the world’s premier natural scientists – an expert on ants, the father of island biogeography, an apostle of the notion that humans share a bond with other species (biophilia) and a herald of the danger posed by extinction. On these and other matters he is also an eloquent writer, having written numerous books on biodiversity, science, and society. So when Wilson started to talk about half-Earth several years ago, people started to listen.

As a scholar of ethics and public policy with an interest in animals and the environment, I have been following the discussion of half-Earth for some time. I like the idea and think it is feasible. Yet it suffers from a major blind spot: a human-centric view on the value of life. Wilson’s entry into this debate, and his seeming evolution on matters of ethics, is an invitation to explore how people ought to live with each other, other animals and the natural world, particularly if vast tracts are set aside for wildlife.

The ethics of Wilson’s volte-face

I heard Wilson speak for the first time in Washington, DC in the early 2000s. At that talk, Wilson was resigned to the inevitable loss of much of the world’s biodiversity. So he advocated a global biodiversity survey that would sample and store the world’s biotic heritage. In this way, we might still benefit from biodiversity’s genetic information in terms of biomedical research, and perhaps, someday, revive an extinct species or two.

Not a bad idea in and of itself. Still, it was a drearily fatalistic speech, and one entirely devoid of any sense of moral responsibility to the world of nonhuman animals and nature.

What is striking about Wilson’s argument for half-Earth is not the apparent about-face from cataloging biodiversity to restoring it. It is the moral dimension he attaches to it. In several interviews, he references the need for humanity to develop an ethic that cares about planetary life, and does not place the wants and needs of a single species (Homo sapiens sapiens) above the well-being of all other species.

The half-Earth proposal prompts people to consider the role of humans in nature.
jene/flickr, CC BY-NC-ND

To my ear, this sounds great, but I am not exactly sure how far it goes. Wilson’s past discussions of conservation ethics appear to me clearly anthropocentric. They espouse the notion that we are exceptional creatures at the apex of evolution, the sole species that has intrinsic value in and of ourselves, and thus we are to be privileged above all other species.

In this view, we care about nature and biodiversity only because we care about ourselves. Nature is useful for us in the sense of resources and ecological services, but it has no value in and of itself. In ethics talk, people have intrinsic value while nature’s only value is what it can do for people – extrinsic value.

For example, in his 1993 book The Biophilia Hypothesis, Wilson argues for “the necessity of a robust and richly textured anthropocentric ethics apart from the issues of rights [for other animals or ecosystems] – one based on the hereditary needs of our own species. In addition to the well-documented utilitarian potential of wild species, the diversity of life has immense aesthetic and spiritual value.”

The passage indicates Wilson’s long-held view that biodiversity is important because of what it does for humanity, including the resources, beauty and spirituality people find in nature. It sidesteps questions of whether animals and the rest of nature have intrinsic value apart from human use.

His evolving position, as reflected in the half-Earth proposal, seems much more in tune with what ethicists call non-anthropocentrism – that humanity is simply one marvelous but not more special outcome of evolution; that other beings, species and/or ecosystems also have intrinsic value; and that there is no reason to automatically privilege us over the rest of life.

Consider this recent statement by Wilson:

What kind of a species are we that we treat the rest of life so cheaply? There are those who think that’s the destiny of Earth: we arrived, we’re humanizing the Earth, and it will be the destiny of Earth for us to wipe humans out and most of the rest of biodiversity. But I think the great majority of thoughtful people consider that a morally wrong position to take, and a very dangerous one.

The non-anthropocentric view does not deny that biodiversity and nature provide material, aesthetic and spiritual “resources.” Rather, it holds there is something more – that the community of life has value independent of the resources it provides humanity. Non-anthropocentric ethics requires, therefore, a more caring approach to people’s impact on the planet. Whether Wilson is really leaving anthropocentrism behind, time will tell. But for my part, I at least welcome his opening up possibilities to discuss less prejudicial views of animals and the rest of nature.

The 50% solution

It is interesting to note that half-Earth is not a new idea. In North America, the half-Earth concept first arose in the 1990s as a discussion about wilderness in the deep ecology movement. Various nonprofits that arose out of that movement continued to develop the idea, in particular the Wildlands Network, the Rewilding Institute and the Wild Foundation.

These organizations use a mix of conservation science, education and public policy initiatives to promote protecting and restoring continental-scale habitats and corridors, all with an eye to preserving the native flora and fauna of North America. One example is ongoing work to connect the Yellowstone to Yukon ecosystems along the spine of the Rocky Mountains.

Take it up a notch? The British Columbia Ministry of Transportation recently started to add signs warning motorists when they are likely to encounter wildlife.
British Columbia Ministry of Transportation, CC BY-NC-ND

When I was a graduate student, the term half-Earth had not yet been used, but the idea was in the air. My classmates and I referred to it as the “50% solution”. We chose this term because of the work in Reed Noss and Allen Cooperrider’s 1994 book, Saving Nature’s Legacy. Amongst other things, the book documents that, depending on the species and ecosystems in question, approximately 30% to 70% of the original habitats of the Earth would be necessary to sustain our planet’s biodiversity. So, splitting the difference, we spoke of the “50% solution” to describe this need.

This leads directly into my third point. The engagement of Wilson and others with the idea of half-Earth and rewilding presupposes but does not fully articulate the need for an urban vision, one where cities are ecological, sustainable and resilient. Indeed, Wilson has yet to spell out what we do with the people and infrastructure that are not devoted to maintaining and teaching about his proposed biodiversity parks. This is not a criticism, but an urgent question for ongoing and creative thinking.

Humans are urbanizing like never before. Today, the majority of people live in cities, and by the end of the 21st century, over 90% of people will live in a metropolitan area. If we are to meet the compelling needs of human beings, we have to remake cities into sustainable and resilient “humanitats” that produce a good life.

Such a good life is not to be measured in simple gross domestic product or consumption, but rather in well-being – freedom, true equality, housing, health, education, recreation, meaningful work, community, sustainable energy, urban farming, green infrastructure, open space in the form of parks and refuges, contact with companion and wild animals, and a culture that values and respects the natural world.

To do all this in the context of saving half the Earth for its own sake is a tall order. Yet it is a challenge that we are up to if we have the will and ethical vision to value and coexist in a more-than-human world.


William Lynn, Research Scientist in Ethics and Public Policy, Clark University

This article was originally published on The Conversation. Read the original article.