Dogs are hypersocial with humans, and their integration into human social ecology makes them a unique model for studying cross-species social bonding. However, the proximate neural mechanisms driving dog-human social interaction are unknown. We used fMRI in 15 awake dogs to probe the neural basis for their preferences for social interaction and food reward. In a first experiment, we used ventral caudate activation as a measure of intrinsic reward value and compared responses to conditioned stimuli that predicted food, praise, or nothing. Relative to the control stimulus, the caudate was significantly more active to the reward-predicting stimuli and showed roughly equal or greater activation to praise versus food in 13 of 15 dogs. To confirm that these differences were driven by the intrinsic value of social praise, we performed a second imaging experiment in which praise was withheld on a subset of trials. The difference in caudate activation to the receipt of praise, relative to its withholding, was strongly correlated with the differential activation to the conditioned stimuli in the first experiment. In a third experiment, we performed an out-of-scanner choice task in which each dog repeatedly selected food or owner in a Y-maze. The relative caudate activation to food- and praise-predicting stimuli in the first experiment was a strong predictor of each dog's sequence of choices in the Y-maze. Consistent with neuroimaging studies of individual differences in human social reward, our findings demonstrate a neural mechanism for preference in domestic dogs that is stable within, but variable between, individuals. Moreover, the individual differences in caudate responses indicate the potentially higher value of social over food reward for some dogs and may help to explain the apparent efficacy of social interaction in dog training.
Spiders have long been suspected to be one of the most important groups of natural enemies of insects worldwide. To document the impact of the global spider community as insect predators, we present estimates of the biomass of annually killed insect prey. Our estimates, assessed with two different methods, suggest that the annual prey kill of the global spider community is in the range of 400-800 million metric tons (fresh weight), with insects and collembolans together composing >90% of the captured prey. This equals approximately 1‰ of the global terrestrial net primary production. Spiders associated with forests and grasslands account for >95% of the annual prey kill of the global spider community, whereas spiders in other habitats are comparatively insignificant contributors over a full year. The spider communities associated with annual crops contribute less than 2% to the global annual prey kill. This can be partly explained by the fact that annual crop fields are “disturbed habitats” with a low buildup of spider biomass and that agrobiont spiders often kill prey only over short periods of the year. Our estimates are supported by published results of exclusion experiments showing that the numbers of herbivorous/detritivorous insects and collembolans increased significantly after spider removal from experimental plots. The presented estimates of the global annual prey kill and of the relative contribution of spider predation in different biomes improve the general understanding of spider ecology and provide a first assessment of the global impact of this very important predator group.
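The per-mille comparison above is simple arithmetic; in the sketch below the global terrestrial NPP figure is an assumed illustrative value (the abstract does not state it), so the result only shows the order of magnitude of the ratio.

```python
# Back-of-envelope check of the "~1 per mille of NPP" comparison.
# Assumed value (not from the abstract): global terrestrial net primary
# production of ~4e11 metric tons fresh weight per year.
GLOBAL_NPP_T = 4e11          # t fresh weight / yr (illustrative assumption)

prey_kill_low = 4e8          # 400 million t/yr (lower estimate in the text)
prey_kill_high = 8e8         # 800 million t/yr (upper estimate in the text)

for kill in (prey_kill_low, prey_kill_high):
    per_mille = kill / GLOBAL_NPP_T * 1000
    print(f"{kill:.0e} t/yr -> {per_mille:.1f} per mille of assumed NPP")
```

With this assumed NPP the ratio falls in the low single per-mille range, consistent with the abstract's "approximately 1‰" characterization.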
The century-old idea that stripes make zebras cryptic to large carnivores has never been examined systematically. We evaluated this hypothesis by passing digital images of zebras through species-specific spatial and colour filters to simulate their appearance to the visual systems of zebras' primary predators, and to zebras themselves. We also measured stripe widths and luminance contrast to estimate the maximum distances from which lions, spotted hyaenas, and zebras can resolve stripes. We found that beyond ca. 50 m in daylight and ca. 30 m in twilight, zebra stripes are difficult for the estimated visual systems of large carnivores to resolve, although humans can still resolve them at these distances. On moonless nights, stripes are difficult for all species to resolve beyond ca. 9 m. In the open, treeless habitats where zebras spend most of their time, zebras are identified by the lion visual system as readily as similar-sized ungulates, suggesting that stripes cannot confer crypsis by disrupting the zebra's outline. Stripes confer a minor advantage over solid pelage in masking body shape in woodlands, but the effect is stronger for humans than for predators. Zebras appear to be less able than humans to resolve stripes, although they are better at it than their chief predators. In conclusion, it appears highly unlikely that stripes are a form of anti-predator camouflage relative to the uniform pelage of other sympatric herbivores.
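The resolving-distance estimates above follow from simple visual-acuity geometry: at an observer's acuity limit, one full stripe cycle subtends 1/acuity degrees. The sketch below uses this relation with a stripe period and acuity values that are illustrative assumptions, not the paper's measurements.

```python
import math

def max_resolving_distance(stripe_period_m: float, acuity_cpd: float) -> float:
    """Distance (m) beyond which a striped pattern blurs to uniform grey.

    At the acuity limit one full stripe cycle subtends 1/acuity degrees,
    so the cutoff distance is:
        d = period / tan((1 / acuity) degrees)
    which for small angles is approximately period * acuity * 180 / pi.
    """
    return stripe_period_m / math.tan(math.radians(1.0 / acuity_cpd))

# Illustrative inputs (assumed, not from the paper): a 10 cm black+white
# stripe cycle, human photopic acuity ~60 cycles/deg, and a much lower
# assumed carnivore acuity of ~5 cycles/deg.
print(f"human:     {max_resolving_distance(0.10, 60):.0f} m")
print(f"carnivore: {max_resolving_distance(0.10, 5):.0f} m")
```

The order-of-magnitude gap between the two outputs illustrates why the same stripes stop being resolvable to a carnivore at tens of metres while remaining visible to a human much farther away.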
Lions (Panthera leo) feed on diverse prey species, a range that is broadened by their cooperative hunting. Although humans are not typical prey, habitual man-eating by lions is well documented. The motivations of the Tsavo and Mfuwe man-eaters (killed in 1898 in Kenya and 1991 in Zambia, respectively) may remain elusive, but we can clarify aspects of their behaviour using dental microwear texture analysis. Specifically, we analysed the surface textures of lion teeth to assess whether these notorious man-eating lions scavenged carcasses during their depredations. Compared with wild-caught lions elsewhere in Africa and other large feliforms, including cheetahs and hyenas, the dental microwear textures of the man-eaters do not suggest extreme durophagy (e.g. bone processing) shortly before death. Dental injuries to two of the three man-eaters examined may have induced shifts in feeding onto softer foods. Further, prompt carcass reclamation by humans likely limited the man-eaters' access to bones. Man-eating was likely a viable alternative to hunting and/or scavenging ungulates because of dental disease and/or limited prey availability.
The behavioral strategies developed by predators to capture and kill their prey are fascinating, notably for predators that forage for prey at, or beyond, the boundaries of their ecosystem. We report here the occurrence of a beaching behavior used by a large-bodied alien freshwater predatory fish (Silurus glanis) to capture birds on land (i.e. pigeons, Columba livia). Of a total of 45 beaching behaviors observed and filmed, 28% resulted in successful bird capture. Stable isotope analyses (δ¹³C and δ¹⁵N) of the predators and their putative prey revealed a highly variable dietary contribution of land birds among individuals. Since this extreme behavior has not been reported in the native range of the species, our results suggest that some individuals in introduced predator populations may adapt their behavior to forage on novel prey in new environments, leading to behavioral and trophic specialization to actively cross the water-land interface.
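Dietary-contribution estimates like those above rest on standard isotope mixing arithmetic. Below is a minimal two-source, single-isotope sketch; every δ value and the trophic enrichment factor are assumed illustrative numbers, not the study's data.

```python
def two_source_mixing(d_pred: float, d_src_a: float, d_src_b: float,
                      enrichment: float = 3.4) -> float:
    """Fraction of diet from source A in a two-source, one-isotope model.

    Corrects the predator's delta value for one trophic step of
    enrichment, then linearly interpolates between the two sources:
        f_A = (d_pred - enrichment - d_B) / (d_A - d_B)
    The result is clamped to [0, 1].
    """
    corrected = d_pred - enrichment
    f_a = (corrected - d_src_b) / (d_src_a - d_src_b)
    return min(1.0, max(0.0, f_a))

# Illustrative delta-15N values (assumed, not from the study):
# bird tissue at 8.0, aquatic prey at 14.0, catfish at 14.5.
frac_birds = two_source_mixing(d_pred=14.5, d_src_a=8.0, d_src_b=14.0)
print(f"estimated bird contribution: {frac_birds:.0%}")
```

Running the same arithmetic per individual catfish is what exposes the between-individual variability the abstract describes: individuals whose corrected δ values sit closer to the bird end-member get a higher estimated bird fraction.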
Animals use a variety of escape mechanisms to increase the probability of surviving predatory attacks. Antipredator defenses can be elaborate, making their evolutionary origin unclear. Trap-jaw ants are known for their rapid and powerful predatory mandible strikes, and some species have been observed to direct those strikes at the substrate, thereby launching themselves into the air and away from a potential threat. This potential escape mechanism has never been examined in a natural context. We studied the use of mandible-powered jumping in Odontomachus brunneus during interactions with a common ant predator: pit-building antlions. Trap-jaw ant workers escaped from antlion pits by running in about half of interactions, and by mandible-powered jumping in 15%. To test whether escape jumps improved individual survival, we experimentally prevented workers from jumping and measured their escape rate. Workers with unrestrained mandibles escaped from antlion pits significantly more often than workers with restrained mandibles. Our results indicate that some trap-jaw ant species can use mandible-powered jumps to escape from common predators. They also provide a charismatic example of evolutionary co-option, in which a trait that evolved for one function (predation) has been co-opted for another (defense).
When identifying potential trophic cascades, it is important to clearly establish the trophic linkages between predators and prey with respect to temporal abundance, demographics, distribution, and diet. In the northwest Atlantic Ocean, the depletion of large coastal sharks was thought to have triggered a trophic cascade whereby release from predation increased cownose ray abundance, which in turn increased predation on, and caused the subsequent collapse of, commercial bivalve stocks. These claims were used to justify the development of a predator-control fishery for cownose rays, the “Save the Bay, Eat a Ray” fishery, to reduce predation on commercial bivalves. A reexamination of the data suggests that declines in large coastal sharks did not coincide with the purported rapid increases in cownose ray abundance. Likewise, the increase in cownose ray abundance did not coincide with declines in commercial bivalves. The lack of temporal correlation, coupled with published diet data, suggests the purported cascade lacks the empirical linkages a trophic cascade requires. Furthermore, the life history parameters of cownose rays indicate low reproductive potential, making their populations incapable of rapid increase. Hypothesized trophic cascades should be closely scrutinized, as spurious conclusions may negatively influence conservation and management decisions.
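The temporal-coincidence argument above amounts to asking whether two abundance time series covary. A minimal sketch of that check, using invented illustrative series rather than the study's data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented annual abundance indices (illustrative only): a steadily
# declining shark index and a roughly flat ray index.
shark_index = [9.0, 8.1, 7.5, 7.2, 6.8, 6.5, 6.1, 6.0, 5.8, 5.6]
ray_index = [3.0, 3.1, 2.9, 3.2, 3.0, 3.1, 3.3, 3.2, 3.1, 3.3]

r = pearson_r(shark_index, ray_index)
print(f"Pearson r between shark and ray indices: {r:.2f}")
```

A strong negative correlation would be the minimum signature expected under the predation-release hypothesis; the reexamination described above found that the real series do not line up this way once timing is considered.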
Alarm communication is a key adaptation that helps social groups resist predation and rally defenses. In Asia, the world’s largest hornet, Vespa mandarinia, and the smaller hornet, Vespa velutina, prey upon foragers and nests of the Asian honey bee, Apis cerana. We attacked foragers and colony nest entrances with these predators and provide the first evidence, in social insects, of an alarm signal that encodes graded danger and attack context. We show that, like Apis mellifera, A. cerana possesses a vibrational “stop signal,” which can be triggered by predator attacks upon foragers and inhibits waggle dancing. Large hornet attacks were more dangerous and resulted in higher bee mortality. Per attack at the colony level, large hornets elicited more stop signals than small hornets. Unexpectedly, stop signals elicited by large hornets (SS large hornet) had a significantly higher vibrational fundamental frequency than those elicited by small hornets (SS small hornet) and were more effective at inhibiting waggle dancing. Stop signals resulting from attacks upon the nest entrance (SS nest) were produced by foragers and guards and were significantly longer in pulse duration than stop signals elicited by attacks upon foragers (SS forager). Unlike SS forager, SS nest were targeted at dancing and non-dancing foragers and had the common effect, tuned to hornet threat level, of inhibiting bee departures from the safe interior of the nest. Meanwhile, nest defenders were triggered by the bee alarm pheromone and live hornet presence to heat-ball the hornet. In A. cerana, sophisticated recruitment communication that encodes food location, the waggle dance, is therefore matched with an inhibitory/alarm signal that encodes information about the context of danger and its threat level.
Here, we document in vivo bite forces recorded from wild piranhas. Integrating these empirical data with allometry, bite simulations, and finite element analysis (FEA), we reconstructed the bite capabilities and potential feeding ecology of the extinct giant Miocene piranha, Megapiranha paranensis. An anterior bite force of 320 N from the black piranha, Serrasalmus rhombeus, is the strongest bite force recorded for any bony fish to date. Results indicate that the bite force of M. paranensis conservatively ranged from 1,240 to 4,749 N and reveal that its novel dentition was capable of resisting high bite stresses and crushing vertebrate bone. Comparisons of bite forces scaled to body size across other apex predators reveal that S. rhombeus and M. paranensis have among the most powerful bites estimated for carnivorous vertebrates. Our results functionally demonstrate the extraordinary bite of serrasalmid piranhas and provide a mechanistic rationale for their predatory dominance among past and present Amazonian ichthyofaunas.
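Body-size scaling of bite force is commonly modeled as force ∝ mass^(2/3) under geometric similarity, since muscle force tracks cross-sectional area. The sketch below applies that relation to the measured 320 N figure; the reference and target masses are assumed illustrative values, not the paper's specimens.

```python
def scaled_bite_force(f_ref: float, m_ref: float, m_target: float,
                      exponent: float = 2.0 / 3.0) -> float:
    """Scale a measured bite force to another body mass.

    Under geometric similarity, muscle cross-sectional area (and hence
    force) scales with body mass to the 2/3 power:
        F_target = F_ref * (m_target / m_ref) ** (2/3)
    """
    return f_ref * (m_target / m_ref) ** exponent

# Measured reference from the abstract: 320 N for S. rhombeus.
# The masses below are assumed illustrative values, not the paper's data.
f_ref, m_ref = 320.0, 1.1          # N, kg (mass is an assumption)
for m in (10.0, 70.0):             # assumed mass range for M. paranensis
    print(f"{m:5.1f} kg -> {scaled_bite_force(f_ref, m_ref, m):6.0f} N")
```

With these assumed masses the isometric projection lands in the same general range as the abstract's 1,240-4,749 N reconstruction, which illustrates why allometry alone carries much of the estimate.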
Data mining approaches have been increasingly applied to the electronic health record and have led to the discovery of numerous clinical associations. Recent data mining studies have suggested a potential association between cat bites and human depression. To explore this possible association in more detail, we first used administrative diagnosis codes to identify patients with either depression or bites, drawn from a population of 1.3 million patients. We then conducted a manual chart review in the electronic health record of all patients with a code for a bite to determine accurately which bites were from cats or dogs. Overall, there were 750 patients with cat bites, 1,108 with dog bites, and approximately 117,000 patients with depression. Depression was found in 41.3% of patients with cat bites and 28.7% of those with dog bites. Furthermore, 85.5% of those with both cat bites and depression were women, compared with 64.5% of those with dog bites and depression. The probability of a woman being diagnosed with depression at some point in her life, if she presented to our health system with a cat bite, was 47.0%, compared with 24.2% for men presenting with a similar bite. The high proportion of depression among patients with cat bites, especially women, suggests that screening for depression could be appropriate for patients who present to a clinical provider with a cat bite. Additionally, while no causative link is known to explain this association, there is growing evidence to suggest that the relationship between cats and human mental illness, such as depression, warrants further investigation.
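The reported percentages can be combined to reconstruct the approximate counts behind them; the arithmetic below uses only figures stated in the abstract above.

```python
# Back-of-envelope reconstruction of the counts implied by the reported
# percentages (all inputs are figures stated in the abstract).
cat_bite_patients = 750
p_depressed = 0.413            # 41.3% of cat-bite patients had depression
p_women_among_both = 0.855     # 85.5% of depressed cat-bite patients were women
p_dep_given_woman = 0.470      # 47.0% of women with cat bites had depression

depressed = cat_bite_patients * p_depressed              # ~310 patients
depressed_women = depressed * p_women_among_both         # ~265 patients
# If 47.0% of bitten women were depressed, the implied number of women
# among the 750 cat-bite patients is:
implied_women = depressed_women / p_dep_given_woman      # ~563 patients

print(f"depressed cat-bite patients: {depressed:.0f}")
print(f"of whom women:               {depressed_women:.0f}")
print(f"implied women with cat bite: {implied_women:.0f}")
```

The implied count of roughly 560 women among 750 cat-bite patients also shows that women made up the large majority of cat-bite presentations in this cohort, which is worth keeping in mind when interpreting the sex-specific depression rates.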