SciCombinator

Discover the latest and most talked-about scientific content and concepts.

Concept: Categorization

234

What determines how languages categorize colors? We analyzed results of the World Color Survey (WCS) of 110 languages to show that despite gross differences across languages, communication of chromatic chips is always better for warm colors (yellows/reds) than cool colors (blues/greens). We present an analysis of color statistics in a large databank of natural images curated by human observers for salient objects and show that objects tend to have warm rather than cool colors. These results suggest that the cross-linguistic similarity in color-naming efficiency reflects colors of universal usefulness and provide an account of a principle (color use) that governs how color categories come about. We show that potential methodological issues with the WCS do not corrupt information-theoretic analyses, by collecting original data using two extreme versions of the color-naming task, in three groups: the Tsimane', a remote Amazonian hunter-gatherer isolate; Bolivian-Spanish speakers; and English speakers. These data also enabled us to test another prediction of the color-usefulness hypothesis: that differences in color categorization between languages are caused by differences in overall usefulness of color to a culture. In support, we found that color naming among Tsimane' had relatively low communicative efficiency, and the Tsimane' were less likely to use color terms when describing familiar objects. Color-naming among Tsimane' was boosted when naming artificially colored objects compared with natural objects, suggesting that industrialization promotes color usefulness.
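The information-theoretic efficiency measure used in analyses like this can be illustrated with a toy surprisal calculation: a chip is communicated efficiently when the terms speakers use for it point a listener back to that chip with little uncertainty. All counts below are hypothetical, and the uniform prior over chips is a simplifying assumption.

```python
import math

# Toy color-naming counts: how often speakers used each term for each chip.
# All numbers are hypothetical, for illustration only.
counts = {
    "warm_chip":  {"red": 9, "other": 1},
    "cool_chip":  {"grue": 6, "other": 4},
    "cool_chip2": {"grue": 5, "other": 5},
}

def expected_surprisal(counts, chip):
    """S(c) = sum_w P(w|c) * -log2 P(c|w); lower means the chip is
    communicated more efficiently. Assumes a uniform prior over chips."""
    n_c = sum(counts[chip].values())
    s = 0.0
    for w, n in counts[chip].items():
        p_w_given_c = n / n_c
        # P(c|w) via Bayes, uniform prior over chips
        denom = sum(counts[c].get(w, 0) / sum(counts[c].values())
                    for c in counts)
        p_c_given_w = p_w_given_c / denom
        s += p_w_given_c * -math.log2(p_c_given_w)
    return s

print(expected_surprisal(counts, "warm_chip"))  # low: near-unanimous naming
print(expected_surprisal(counts, "cool_chip"))  # higher: naming is split
```

In this toy example the warm chip's dominant term is unambiguous, so its expected surprisal is low; the cool chips share a split vocabulary, so a listener is left more uncertain.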

Concepts: Scientific method, Color, Categorization, Communication, Color theory, Names, Basic Color Terms: Their Universality and Evolution, Linguistic relativity and the color naming debate

110

The capability of animals to emit light, called bioluminescence, is considered a major factor in ecological interactions. Because it occurs across diverse taxa, measurements of bioluminescence can be a powerful tool for detecting and quantifying organisms in the ocean. In this study, 17 years of video observations were recorded by remotely operated vehicles during surveys off the California coast, from the surface down to 3,900 m depth. More than 350,000 observations were classified for their bioluminescence capability based on literature descriptions. The organisms represented 553 phylogenetic concepts (species, genera, or families, at the most precise taxonomic level defined from the images), distributed within 13 broader taxonomic categories. The results highlight the importance of bioluminescent marine taxa throughout the water column: 76% of the observed individuals have bioluminescence capability. More than 97% of cnidarians were bioluminescent, and 9 of the 13 taxonomic categories were bioluminescent-dominant. The percentage of bioluminescent animals is remarkably uniform over depth, although the proportion of bioluminescent and non-bioluminescent animals within taxonomic groups changes with depth for Ctenophora, Scyphozoa, Chaetognatha, and Crustacea. Given these results, bioluminescence must be considered an important ecological trait from the surface to the deep sea.
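The depth analysis reported above amounts to tallying observations into depth bins and computing the bioluminescent fraction per bin. A minimal sketch, using entirely hypothetical records rather than the study's data:

```python
from collections import defaultdict

# Hypothetical observation records: (taxon, depth_m, bioluminescent?)
observations = [
    ("Ctenophora", 150, True), ("Ctenophora", 1200, True),
    ("Crustacea", 300, False), ("Crustacea", 2500, True),
    ("Scyphozoa", 800, True),  ("Chaetognatha", 3000, False),
]

def fraction_bioluminescent(obs, bin_size=1000):
    """Fraction of bioluminescent individuals per depth bin (meters)."""
    totals, lum = defaultdict(int), defaultdict(int)
    for _, depth, is_lum in obs:
        b = depth // bin_size
        totals[b] += 1
        lum[b] += is_lum  # True counts as 1
    return {b * bin_size: lum[b] / totals[b] for b in sorted(totals)}

print(fraction_bioluminescent(observations))
```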

Concepts: Animal, Greek loanwords, Categorization, Cnidaria, Jellyfish, Taxonomic rank, Ocean, Alpha taxonomy

28

Expressions of emotion are often brief, providing only fleeting images on which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d' analysis and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorization. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms.
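The d' (d-prime) analysis mentioned above measures sensitivity independent of response bias: d' = z(hit rate) - z(false-alarm rate), where z is the inverse standard-normal CDF. A minimal sketch with hypothetical rates, not the study's data:

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index d' = z(H) - z(F), where z is the inverse
    standard-normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical rates for illustration:
print(d_prime(0.80, 0.20))  # well above chance (d' about 1.68)
print(d_prime(0.55, 0.45))  # near chance, like fearful vs. angry here
```

d' = 0 corresponds to chance performance, which is why "consistently poor" categorization shows up as d' values near zero regardless of how liberally or conservatively observers respond.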

Concepts: Taxonomy, Categorization, Emotion, Paul Ekman, All rights reserved, Affect display, Emotional expression, Anger

27

Research on scene categorization generally concentrates on gist processing, particularly the speed and minimal features with which the “story” of a scene can be extracted. However, this focus has led to a paucity of research into how scenes are categorized at specific hierarchical levels (e.g., a scene could be a road or more specifically a highway); consequently, research has disregarded a potential diagnostically driven feedback process. We presented participants with scenes that were low-pass filtered so only their gist was revealed, while a gaze-contingent window provided the fovea with full-resolution details. By recording where in a scene participants fixated prior to making a basic- or subordinate-level judgment, we identified the scene information accrued when participants made either categorization. We observed a feedback process, dependent on categorization level, that systematically accrues sufficient and detailed diagnostic information from the same scene. Our results demonstrate that during scene processing, a diagnostically driven bidirectional interplay between top-down and bottom-up information facilitates relevant category processing.

Concepts: Taxonomy, Feedback, Categorization, Top-down and bottom-up design, Library classification, Dichotomies

27

A unified general theory of human concept learning based on the idea that humans detect invariance patterns in categorical stimuli as a necessary precursor to concept formation is proposed and tested. In GIST (generalized invariance structure theory) invariants are detected via a perturbation mechanism of dimension suppression referred to as dimensional binding. Structural information acquired by this process is stored as a compound memory trace termed an ideotype. Ideotypes inform the subsystems that are responsible for learnability judgments, rule formation, and other types of concept representations. We show that GIST is more general (e.g., it works on continuous, semi-continuous, and binary stimuli) and makes much more accurate predictions than the leading models of concept learning difficulty, such as those based on a complexity reduction principle (e.g., number of mental models, structural invariance, algebraic complexity, and minimal description length) and those based on selective attention and similarity (GCM, ALCOVE, and SUSTAIN). GIST unifies these two key aspects of concept learning and categorization. Empirical evidence from three experiments corroborates the predictions made by the theory and its core model which we propose as a candidate law of human conceptual behavior.

Concepts: Psychology, Science, Empiricism, Idea, Thought, Concept, Categorization, Concepts

18

Grouping auditory stimuli into common categories is essential for a variety of auditory tasks, including speech recognition. We trained human participants to categorize auditory stimuli from a large novel set of morphed monkey vocalizations. Using fMRI-rapid adaptation (fMRI-RA) and multi-voxel pattern analysis (MVPA) techniques, we gained evidence that categorization training results in two distinct sets of changes: sharpened tuning to monkey call features (without explicit category representation) in left auditory cortex and category selectivity for different types of calls in lateral prefrontal cortex. In addition, the sharpness of neural selectivity in left auditory cortex, as estimated with both fMRI-RA and MVPA, predicted the steepness of the categorical boundary, whereas categorical judgment correlated with release from adaptation in the left inferior frontal gyrus. These results support the theory that auditory category learning follows a two-stage model analogous to the visual domain, suggesting general principles of perceptual category learning in the human brain.

Concepts: Neuron, Brain, Human brain, Cerebrum, Limbic system, Frontal lobe, Categorization, Premotor cortex

16

Observers can rapidly perform a variety of visual tasks such as categorizing a scene as open, as outdoor, or as a beach. Although we know that different tasks are typically associated with systematic differences in behavioral responses, to date, little is known about the underlying mechanisms. Here, we implemented a single integrated paradigm that links perceptual processes with categorization processes. Using a large image database of natural scenes, we trained machine-learning classifiers to derive quantitative measures of task-specific perceptual discriminability based on the distance between individual images and different categorization boundaries. We showed that the resulting discriminability measure accurately predicts variations in behavioral responses across categorization tasks and stimulus sets. We further used the model to design an experiment, which challenged previous interpretations of the so-called “superordinate advantage.” Overall, our study suggests that observed differences in behavioral responses across rapid categorization tasks reflect natural variations in perceptual discriminability.
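One way to picture the discriminability measure described above is as an image's distance to a learned categorization boundary. The sketch below uses a hypothetical linear boundary in a 2-D feature space; the study's actual classifiers and image features are not specified here.

```python
import math

# Hypothetical linear categorization boundary w.x + b = 0 in a 2-D
# feature space (a stand-in for a trained classifier).
w, b = [1.0, -2.0], 0.5

def boundary_distance(x):
    """Signed distance from feature vector x to the boundary; its
    magnitude serves as a per-image discriminability measure."""
    return (sum(wi * xi for wi, xi in zip(w, x)) + b) / math.hypot(*w)

typical_scene   = [3.0, 0.5]  # far from the boundary: fast, accurate
ambiguous_scene = [1.0, 0.8]  # near the boundary: slower, error-prone
print(boundary_distance(typical_scene), boundary_distance(ambiguous_scene))
```

Under this picture, behavioral differences across tasks fall out of geometry: the same image can sit far from one category boundary but close to another, predicting different response patterns per task.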

Concepts: Scientific method, Psychology, Understanding, Taxonomy, Observation, Perception, Concept, Categorization

16

Shaped by biological [1, 2] and social [3] evolutionary pressures, facial expressions of emotion comprise specific facial movements [4-8] that support a near-optimal system of signaling and decoding [9, 10]. Although facial expressions are highly dynamic [11, 12], little is known about the form and function of their temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously, to optimize categorization of the six classic emotions, or sequentially, to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling [13-15], information theory [16, 17], and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of “biologically basic to socially specific” information over time. Early in the signaling dynamics, facial expressions systematically transmit a few biologically rooted face signals [1] supporting the categorization of fewer elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). We thus show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication comprises six basic (i.e., psychologically irreducible) categories [18] and instead suggesting four.

Concepts: Psychology, Evolution, Biology, Species, Categorization, Dynamics, Face, Paul Ekman

13

The aim of this study is to categorise cancers into broad groups based on clusters of common treatment aims, experiences and outcomes to provide a numerical framework for understanding the services required to meet the needs of people with different cancers. This framework will enable a high-level overview of care and support requirements for the whole cancer population.

Concepts: Cancer, Chemotherapy, United Kingdom, Knowledge, Categorization, Requirement

7

This combined EEG and eye-tracking study examined categorization learning at different ages and investigated to what degree categorization learning is mediated by visual attention and perceptual strategies. Seventeen young and ten elderly subjects performed a visual categorization task with two abstract categories, each consisting of prototypical stimuli and an exception. The categorization of prototypical stimuli was learned very early in the experiment, while the learning of exceptions was delayed. The categorization of exceptions was accompanied by higher P150, P250, and P300 amplitudes. In contrast to younger subjects, elderly subjects had problems categorizing the exceptions but showed intact categorization performance for prototypical stimuli. Moreover, elderly subjects showed higher fixation rates on important stimulus features and higher P150 amplitudes, which were positively correlated with categorization performance. These results indicate that elderly subjects compensate for cognitive decline through enhanced perceptual and attentional processing of individual stimulus features. Additionally, a computational modeling approach showed a transition away from purely abstraction-based learning toward exemplar-based learning in the middle block for both groups, although the models fit younger subjects better than elderly subjects. The study demonstrates that human categorization learning proceeds from early abstraction-based processing to an exemplar-memorization stage, a combination that facilitates the learning of real-world categories with a nuanced category structure. It also suggests that categorization learning is affected by normal aging and modulated by perceptual processing and visual attention.
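The abstraction-based versus exemplar-based distinction can be sketched with a minimal GCM-style model: a prototype model compares a stimulus only to each category's average, while an exemplar model compares it to every stored training item, which lets it capture exceptions. The stimuli and the sensitivity parameter c below are hypothetical, for illustration only.

```python
import math

def similarity(x, y, c=10.0):
    """Exponentially decaying similarity (GCM-style); c is a
    hypothetical sensitivity parameter."""
    return math.exp(-c * math.dist(x, y))

def exemplar_prob(x, cat_a, cat_b):
    """P(A | x) from summed similarity to every stored exemplar."""
    sa = sum(similarity(x, e) for e in cat_a)
    sb = sum(similarity(x, e) for e in cat_b)
    return sa / (sa + sb)

def prototype_prob(x, cat_a, cat_b):
    """P(A | x) from similarity to each category's mean (prototype)."""
    mean = lambda ex: [sum(d) / len(ex) for d in zip(*ex)]
    return similarity(x, mean(cat_a)) / (
        similarity(x, mean(cat_a)) + similarity(x, mean(cat_b)))

# Toy categories: A holds an "exception" deep inside B's region.
A = [[0.0, 0.0], [0.2, 0.1], [2.0, 2.0]]  # last item is the exception
B = [[2.0, 2.2], [1.8, 2.0], [2.2, 1.8]]

probe = [2.0, 2.0]  # the exception itself
print(prototype_prob(probe, A, B))  # < 0.5: abstraction alone fails
print(exemplar_prob(probe, A, B))   # > 0.5: exemplar memory recovers it
```

This mirrors the finding above: prototypical items are handled well by early abstraction-based processing, whereas exceptions are only learned once specific exemplars are memorized.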

Concepts: Present, Psychology, Attention, Cognition, Cognitive science, Learning, Ageing, Categorization