
Although there has been much work developing and testing CBT interventions for promoting ART adherence, there is still room for improvement because traditional multi-component CBT interventions for ART adherence result in small to medium effect sizes. Overall, the current study extends the literature on distress intolerance as a psychological vulnerability factor among people living with HIV. However, there are some limitations that provide opportunities for future research. First, the present study was cross-sectional, limiting inferences that can be made about directionality. Indeed, it is just as likely that low levels of distress tolerance lead to poor adherence as it is that poor adherence is prospectively associated with low levels of distress tolerance. This may be particularly relevant among immunocompromised individuals living with HIV. For instance, if poor adherence leads to an increasing viral load, then one’s immune system is mobilized to contend with the growing viral load. This is a physiological stressor, and stress increases one’s drive to escape from unpleasant situations. Thus, it is feasible that stress due to immunological reactivity from an increasing viral load further limits one’s capacity to tolerate distress. It is plausible that poor adherence resulting in an increasing viral load may subsequently increase one’s vulnerability to distress intolerance. Second, as adherence was measured using pill count at only one time point, we were unable to establish a baseline level of adherence; MEMS caps would have provided a more precise indicator of adherence. Third, as mentioned earlier, though a strength of the study was the multi-method measurement of distress tolerance, future work would benefit from employing additional objective and newly refined subjective measures to better understand differential relations between multiple facets of distress tolerance and HIV adherence.

Future work would also benefit from assessing tolerance of HIV symptom-related distress specifically, and the impact of distress tolerance on other clinically relevant HIV outcomes. Finally, though the present study was quite ethnically diverse, a majority of the sample was male. Future work would benefit from recruiting a more gender-diverse sample from different geographic areas. Promoting tolerance of affective distress and of the distressing tasks associated with the high adherence demands of ART for HIV management is worthwhile to consider in future research. Future investigations are needed to examine these relations prospectively to identify the role of distress intolerance in the development and maintenance of poor HIV management, and then to verify clinical implications through intervention process and outcome studies.

A consortium of 67 scientific institutions from 24 European countries and beyond, covering over thirty scientific disciplines ranging from anthropology to toxicology, responded to an invitation by the European Commission to study the place of addictions in contemporary European society. The resulting five-year endeavour, the Addictions and Lifestyles in Contemporary Europe – Reframing Addictions Project, went beyond this. It reframed our understanding of addictions and formulated a blueprint for re-designing the governance of addictions. This paper summarizes the project’s conclusions, pointing to new understandings of the science and policy of nicotine, illegal drugs and alcohol, hereafter collectively referred to as ‘drugs’. Although this paper does not cover process addictions, much of what is said applies to addictions beyond drugs. It contrasts two powerful pieces of evidence: the harm done by drugs, versus the poorly structured existing governance approaches designed to manage such harm.

The paper continues by considering three bases for re-thinking the addiction concept in ways that could lead to improved strategies across different jurisdictions: recognition that there is a biological predisposition for people to seek out and ingest drugs; that heavy use over time becomes a replacement concept and descriptor for the term substance use disorder; and that quantitative risk assessment can be used to standardize harm across different drugs, based on drug potency and exposure. The paper finishes by proposing two approaches that could strengthen addictions governance: embedding governance within a well-being frame, and adopting an accountability system, a health footprint that apportions responsibility for who and what causes drug-related harm.

Governance is defined as the processes and structures of public policy decision making and management that engage people across the boundaries of public agencies, levels of government, and the public, private and civic spheres to carry out a public purpose that cannot be accomplished by any one sector alone. The exclusive use of top-down bureaucratic approaches cannot maximize societal benefits when dealing with ‘wicked problems’ that are highly resistant to resolution. An analysis of 28 European countries found that no single country had a comprehensive policy for all drugs within a broad societal well-being approach. For more detail, see ‘Governance of Addictions: European Public Policies’ by Albareda et al. There are at least three reasons for ineffective and poorly integrated governance. Firstly, the same harm done by drugs is defined and understood in different ways in different countries and state systems. Seen from a trans-national comparative perspective, there is a lack of a common understanding of appropriate policies, and responses are often constrained by approaches that are tied to assumptions that are not evidence-based.
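The quantitative risk assessment idea mentioned above, standardizing harm by combining potency and exposure, is often operationalized as a margin of exposure: a toxicological benchmark dose divided by estimated human intake, with smaller margins indicating higher risk. The following is a minimal illustrative sketch of that ratio; the numeric values in the example are hypothetical placeholders, not figures from the project.

```python
def margin_of_exposure(bmdl_mg_per_kg_day: float, intake_mg_per_kg_day: float) -> float:
    """Margin of exposure (MOE) = benchmark dose / estimated daily intake.

    Both arguments are doses in mg per kg of body weight per day.
    A lower MOE means intake is closer to the dose known to cause harm,
    i.e. higher population-level risk.
    """
    if intake_mg_per_kg_day <= 0:
        raise ValueError("intake must be positive")
    return bmdl_mg_per_kg_day / intake_mg_per_kg_day


# Hypothetical comparison of two substances (placeholder numbers):
# substance A is potent but lightly used; substance B is weaker but heavily used.
moe_a = margin_of_exposure(bmdl_mg_per_kg_day=10.0, intake_mg_per_kg_day=0.5)   # MOE = 20
moe_b = margin_of_exposure(bmdl_mg_per_kg_day=500.0, intake_mg_per_kg_day=100.0)  # MOE = 5
```

On these placeholder inputs the heavily used, lower-potency substance ends up the riskier of the two, which is the kind of cross-drug comparison such standardization is meant to enable.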

Ways of thinking about the harm done by drugs vary enormously, with considerable heterogeneity between different drugs, and between international, national and local levels of governance. For detail, see ‘Concepts of Addictive Substances and Behaviours across Time and Place’ by Hellman et al. Secondly, a multitude of commercial, political and public stakeholders are active in addictions governance at national and international levels. In any given society, stakeholders that have power, means and influence are likely to achieve an advantageous, influential position. Concepts of addiction are also shaped by popular constructs promulgated by the mass media and by customs in the general population. Stakeholder positions and perceptions of drug problems also vary over time and by area, implying that sustainable approaches must be interwoven into societal and governance structures. Thirdly, corporate power, through multiple channels of influence, can hinder evidence-based policy decisions. Corporate strategies often include attempts to influence civil society, science and the media, as part of a wider aim to manage and, if possible, capture institutions that set policy. Transparency is insufficient given that the multiplicity of channels of corporate power is poorly acknowledged and understood by policy makers. Therefore, the rules in place to ensure level playing fields for discussion and equitable decision-making across all actors are inadequate.

The idea that human exposure to drugs did not occur until late in human evolution, thus leaving our species inexperienced, is often posited as one of the reasons that these substances cause so much harm. However, multidisciplinary scientific evidence suggests otherwise. Many substances consumed today are not evolutionary novelties. In the story of terrestrial life over the last 400 million years or so, one ongoing theme has been the “battle” between plants and the animals that eat them.
Among their many defence mechanisms, plants produce numerous chemicals, including tetrahydrocannabinol, cocaine, nicotine, and opiates, all of which are potent neurotoxins that deter consumption of plant tissue by animals. From an evolutionary perspective, we thus find natural selection for compounds that discourage consumption of the plant by punishing potential consumers. By contrast, there has been no natural selection for expression of psychoactive compounds that encourage consumption, a pattern also predicted by neurobiological and behavioural psychology theories of reward and reinforcement for contemporary drugs. Counterbalancing the development of plant neurotoxins, plant-eating animals have evolved to counter-exploit plants’ production of drugs, for instance by exploiting the anti-parasitic properties of some of them.

Many species of invertebrates and vertebrates engage in pharmacophagy, the deliberate consumption and sequestration of plant toxins, to dissuade parasites and predators. In a human context, present-day examples of pharmacophagy may be seen among Congo basin hunter-gatherers, in whom the quantity of cannabis and nicotine consumed is titrated against intestinal worm burden – the higher the intake, the lower the worm burden. In individuals treated with the anti-worm drug albendazole, the number of nicotine-containing cigarettes smoked is reduced. Although parasite–host co-evolution is recognized as a potent selective force in nature, other, subtler evolutionary dynamics may affect human and animal interactions with plant-based drugs, including that they may buffer against nutritional and energetic constraints on signalling in the central nervous system. Ethnographic research reveals that many indigenous groups classify “drugs” as food, rather than psychoactive entities, and that they are perceived as having food-like effects, most notably increasing tolerance for fatigue, hunger and thermal stress in nutritionally-constrained environments. The causes of these perceived effects have not been a focus of research, but there are clues that the “food” classification may be literal rather than allegorical. Common plant toxins not only mimic mammalian neurotransmitters, they are also synthesized from the same nutritionally-constrained amino acid precursors, such as tyrosine and tryptophan. In harsh environments, toxic plants could function as a “famine food” providing essential dietary building blocks, or may function as a direct substitute for nutritionally-constrained endogenous neurotransmitters. There is some evidence to support this hypothesis in animal research; for example, wood rats in cold environments reduce thermoregulatory costs by modulating body temperature with plant toxins consumed from the juniper plant.
In the case of ethanol, its presence within ripe fruit suggests low-level but chronic dietary exposure for all fruit-eating animals, with volatilized alcohols potentially serving in olfactory localization of nutritional resources. Molecular evolutionary studies indicate that an oral digestive enzyme capable of rapidly metabolizing ethanol was modified in human ancestors near the time that they began extensively using the forest floor, about 10 million years ago; humans have retained the fast-acting enzyme to this day. By contrast, the same alcohol dehydrogenase in our more ancient and mostly tree-dwelling ancestors did not oxidize ethanol as efficiently. This evolutionary switch suggests that exposure to dietary sources of ethanol became more common as hominids adapted to bipedal life on the ground. Ripe fruits accumulating on the forest floor could provide substantially more ethanol cues and result in greater caloric gain relative to fruits ripening within the forest canopy, and our contemporary patterns of alcohol consumption and excessive use may accordingly reflect millions of years of natural dietary exposure. This evolutionary evidence does not imply that humans also evolved to specifically consume nicotine, for example, or that nicotine use is beneficial in the modern world. What is novel in the modern world is the high degree of availability, and high concentration of psychoactive agents and routes of consumption that promote intoxication. What is different with alcohol in the modern world is novel availability through fermentative technology, enabling humans to consume it as a beverage, devoid of food bulk, with higher ethanol content, and artificially higher salience than that which characterizes fruit fermenting in the wild. 
The evolutionary evidence has two implications: firstly, policies that prohibit the use of drugs are likely to fail because people have a biological predisposition to seek out chemicals with varying nutritional and pharmacological properties; and secondly, in present-day society, drug delivery systems have been developed that go beyond what is found in the natural environment, particularly with respect to potency, availability and taste, which can be argued to be the more central drivers of harm. Potency is largely determined by producer organisations operating in markets which, from the perspective of overall societal well-being, are inadequately managed. Better regulation of potency can become a major opportunity for additional policy interventions – for example with alcohol, see ‘Evidence of reducing ethanol content in beverages to reduce harmful use of alcohol’ by Rehm et al.

To better understand the interference of drugs in human biology and functioning, the consensus reached in ALICE RAP was that the concept and term ‘heavy use over time’ should be proposed as the replacement for ‘substance use disorder’. In medical settings, and indeed often in academic and lay settings, heavy users of drugs are commonly dichotomized into those with a ‘substance use disorder’ and those without. ‘Substance use disorder’ is a clinical construct that is often used as shorthand to identify individuals who might benefit from advice or treatment. But as a condition in itself, it is a medical artefact that occurs in all grades of severity, with no natural distinction between ‘health’ and ‘disease’. This is illustrated with alcohol.
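The potency lever discussed above can be made concrete with simple arithmetic: the grams of pure ethanol in a beverage are its volume times its alcohol by volume (ABV) times the density of ethanol (about 0.789 g/ml), so lowering ABV lowers ethanol intake proportionally even when drinking volume is unchanged. A minimal sketch (the beverage sizes and ABV figures are illustrative, not drawn from the studies cited above):

```python
ETHANOL_DENSITY_G_PER_ML = 0.789  # density of pure ethanol, grams per millilitre


def grams_of_ethanol(volume_ml: float, abv_percent: float) -> float:
    """Grams of pure ethanol in a beverage of given volume and ABV."""
    pure_ethanol_ml = volume_ml * (abv_percent / 100.0)
    return pure_ethanol_ml * ETHANOL_DENSITY_G_PER_ML


# Illustrative comparison: a 500 ml beer at 5.0% ABV versus the same
# beer reformulated to 4.0% ABV.
standard = grams_of_ethanol(500, 5.0)      # about 19.7 g of ethanol
reformulated = grams_of_ethanol(500, 4.0)  # about 15.8 g of ethanol
```

Under these assumptions, a one-point ABV reduction removes roughly a fifth of the ethanol per serving, which is the mechanism behind the reduced-ethanol-content interventions referenced above.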