Thirty-five percent of the women in this sample were born outside of the United States

Comparisons using data from birth certificate records included race/ethnicity, maternal age, education, payment for delivery, participation in the Women, Infants, and Children (WIC) program, parity, maternal birthplace, report of smoking during pregnancy, maternal body mass index (BMI), trimester when prenatal care began, and number of prenatal care visits. For multi-parous women, we examined the relationship between preterm birth and previous preterm birth, previous cesarean delivery, and interpregnancy interval. Interpregnancy interval was calculated from the previous live birth as reported in linked records and estimated as months to conception of the index pregnancy. Because the day of the previous live birth was not available, the middle of the month was used for calculation purposes. Factors from hospital discharge ICD-9 diagnoses included: preexisting hypertension without progression to preeclampsia, preexisting hypertension with progression to preeclampsia, gestational hypertension without progression to preeclampsia, gestational hypertension with progression to preeclampsia, preexisting diabetes, and gestational diabetes. We also compared preterm birth with respect to the frequency of coded infection, anemia, drug or alcohol dependence/abuse, and mental disorder. Multi-variable models of maternal risk and protective factors for preterm birth were built for each location-of-residence category using backwards-stepwise Poisson regression, wherein initial inclusion was determined by a threshold of p < .20 in crude analyses. Adjusted RRs and their 95% CIs were calculated for each residence stratum. To visualize risk by census tract, cumulative risk scores were used to estimate each woman’s overall risk of preterm birth.

Scores were calculated for each woman by adding her risk factors and subtracting her protective factors remaining in the final multi-variable model. Risk scores were grouped as 0.0 or less, 0.1 to 0.9, 1.0 to 1.9, 2.0 to 2.9, and 3.0 or more. Drug dependence/abuse and mental illnesses were further classified based on ICD-9 diagnostic codes, although risk calculations were not computed due to small numbers. Drug dependence/abuse was defined by classification of drug: opioid, cocaine, cannabis, amphetamine, other drug dependence/abuse, and polysubstance dependence/abuse. Mental illnesses were further classified as schizophrenic disorders, bipolar disorder, major depression, depressive disorder, anxiety disorders, personality disorders, and more than one of the previously mentioned categories. Infection was further classified as asymptomatic bacteriuria, urinary tract infection, sexually transmitted infection, and viral infection. Additionally, rates of preterm birth by subgroup were examined. As previously described, pregnancies resulting in spontaneous preterm birth were those for which birth certificate or hospital discharge records indicated premature rupture of membranes (PROM) or premature labor, or for which tocolytic medications were administered. Pregnancies resulting in provider-initiated preterm births were those without PROM, premature labor, or tocolytic administration for which there was a code for “induction” or “artificial rupture of membranes,” or for which there was a cesarean delivery without any of the aforementioned codes. All analyses were performed using Statistical Analysis Software (SAS) version 9.4. Methods and protocols for the study were approved by the Committee for the Protection of Human Subjects within the Health and Human Services Agency of the State of California. Data used for the study were received by the California Preterm Birth Initiative at the University of California San Francisco by June 2016.

The sample included 81,021 women: 29,052 with urban residence, 24,377 with suburban residence, and 27,592 with rural residence. The majority of the women in the sample were Hispanic, between 18 and 34 years of age at delivery, WIC participants, and multi-parous.
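Two of the derived quantities described in the methods above (the interpregnancy interval in months and the cumulative risk score with its grouping) reduce to simple arithmetic. The sketch below is illustrative only: the factor names and weights are hypothetical placeholders rather than the study's estimates, while the mid-month convention and the score groupings follow the description above.

```python
from datetime import date

def interpregnancy_interval_months(prev_birth_year: int, prev_birth_month: int,
                                    conception: date) -> float:
    """Approximate months from a previous live birth to conception of the
    index pregnancy. The day of the prior birth is unknown, so the 15th
    (mid-month) is used, mirroring the convention described in the methods."""
    prev_birth = date(prev_birth_year, prev_birth_month, 15)
    return (conception - prev_birth).days / 30.44  # average days per month

def cumulative_risk_score(present_factors: set[str], weights: dict[str, float]) -> float:
    """Add weights for risk factors and subtract protective factors
    (negative weights) retained in a final multi-variable model."""
    return sum(weights[f] for f in present_factors if f in weights)

def risk_group(score: float) -> str:
    """Grouped categories used in the text."""
    if score <= 0.0:
        return "0.0 or less"
    if score < 1.0:
        return "0.1 to 0.9"
    if score < 2.0:
        return "1.0 to 1.9"
    if score < 3.0:
        return "2.0 to 2.9"
    return "3.0 or more"

# Hypothetical weights: positive values are risks, negative values are protective.
weights = {"black_race_ethnicity": 1.0, "preexisting_diabetes": 0.8,
           "interpregnancy_interval_lt_6mo": 0.5, "wic_participation": -0.3}
score = cumulative_risk_score({"preexisting_diabetes", "wic_participation"}, weights)
print(round(interpregnancy_interval_months(2013, 3, date(2013, 8, 20)), 1),
      score, risk_group(score))
```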

The demographic makeup of the three residence locations differed. For example, 8.0% of the urban population, 6.7% of suburban mothers, and 2.2% of the rural population were of Black race/ethnicity. Nine percent of women in urban residences, 8.0% of women in suburban residences, and 8.2% of women in rural residences delivered preterm. Of these, 1.4% of women living in urban residences delivered before 32 weeks, while 1.1% of women in suburban or rural residences delivered this early. More specifically, 1.7% of women in the Fresno East Central MSSA delivered before 32 weeks. Four individual census tracts within urban MSSAs had rates of birth at less than 32 weeks’ gestation of 2.0% or greater, with an n of 16 or more as the reporting threshold. In the final multi-variable models, Black women were found to be at elevated risk of preterm birth across all residence strata. Similarly, women with interpregnancy intervals of less than six months were at elevated risk across residence strata. Women with comorbidities such as preexisting and gestational diabetes, preexisting hypertension, and infection were also at increased risk of having a preterm birth. Other factors, such as public insurance for delivery, less than 12 years of education, underweight BMI, and an interpregnancy interval over 59 months, were risk factors only for women living in urban residences. Hispanic women were at increased risk of preterm birth only in rural residences. Women living in urban and rural residences who participated in WIC were less likely to deliver preterm. For urban women, birth in Mexico and overweight BMI also showed a protective effect against preterm birth. Not only did the risk models differ by residence within Fresno County, but the percentage of women with a given risk factor varied greatly for some factors. In urban residences, 12.2% of women with preterm births smoked, while 6.6% of women in rural residences with preterm births smoked. Similarly, 8.9% of urban women with a preterm birth used drugs or alcohol, compared with 4.4% of women in rural residences with a preterm birth.

Nearly five percent of urban women delivering preterm had fewer than three prenatal care visits, compared with 2.3% of women in suburban residences. The percentage of women with a preterm birth and an interpregnancy interval of less than six months ranged from 7.7% to 11.2%. Examining these risk factors in more geographic detail highlights appropriate targets for preterm birth reduction. For instance, in six census tracts, 15% or more of mothers of preterm infants smoked during their pregnancy: four in urban residences and two in suburban residences. Also, in five census tracts in urban residences, over 10% of mothers who delivered preterm used drugs or alcohol. Over 2,600 women delivering in Fresno County had a cumulative risk score for preterm birth of 3.0 or more: 2.2% of women living in urban residences, 4.1% in suburban residences, and 3.7% in rural residences had this high risk score. In this study of preterm births in Fresno County, we found that the type and magnitude of risk and protective factors differed by women’s location of residence. Black women and women with diabetes, hypertension, infection, fewer than three prenatal care visits, a previous preterm birth, or an interpregnancy interval of less than six months were at increased risk of preterm birth, regardless of location of residence. Public insurance, maternal education of less than 12 years, underweight BMI, and an interpregnancy interval of five years or more were identified as risk factors only for women in urban residences. Women living in urban locations who were born in Mexico and who were overweight by BMI were at lower risk for preterm birth; WIC participation was protective for women in both urban and rural locations. Taken together, these findings suggest that targeted place-based interventions and policy recommendations can be pursued. The preterm birth risk factors identified in these analyses are not unique to Fresno County: previous work has also shown that women of color, women with lower education or lower socioeconomic status, women with comorbidities such as hypertension and diabetes, women who smoke, and women with a short interpregnancy interval are at elevated risk of preterm birth. In Fresno County, however, we observed that these risks differ in magnitude. This is critical, as the percentage of women in each region with a given risk factor can vary greatly. Hispanic women were at increased risk of preterm birth in rural residences. The degree of risk was mild, only a 1.1-fold increase, but 72% of the population giving birth in rural Fresno County is Hispanic, suggesting that interventions reaching this population may provide the most impact. Similarly, Black women were at elevated risk of preterm birth regardless of location of residence. Since urban residences have the highest percentage of Black women and rural residences the lowest, focusing prevention efforts for Black women in urban residences may be an effective approach.
Others have found that pre-pregnancy initiation of Medicaid is associated with earlier initiation of prenatal care, a factor that may reduce preterm birth rates. In addition, participation in the WIC program has been shown to moderately reduce the risk of a small-for-gestational-age infant and has been associated with reduced infant mortality in Black populations. Fresno women from both urban and rural residences who participated in the WIC program were less likely to deliver preterm, while women living in urban locations who were publicly insured through Medi-Cal coverage for delivery were at increased risk for preterm birth.

Low income is a criterion for both public assistance programs, and over 32% of families in this region live below the poverty line; it is apparent that socioeconomic status is a complex risk factor for preterm birth. A key takeaway from this study is that women who accessed prenatal care more frequently (three or more prenatal care visits) were less likely to deliver preterm. Fresno County may be able to improve preterm birth rates by addressing factors that encourage prenatal care access, which may include enrollment in Medi-Cal during the preconception period and increasing WIC participation. Identifying regions where a high percentage of women do not access three or more prenatal care visits may suggest locations for interventions such as home visits or a mobile clinic. Using a large administrative database allows for examination of rates and risks that would not be possible with other data sources. Despite these strengths, the study has some critical limitations. By design, the findings are specific to one area of California and may not be as applicable to other areas of the state, country, or world. In fact, we recently conducted a similar study examining preterm birth risk factors by subtype for all of California. Our findings in Fresno County identified both similar and different risk factors for preterm birth. Similar to the entire California population, we demonstrated increased risk of preterm birth for Fresno County women who were of Black race/ethnicity, who had diabetes or hypertension during pregnancy, or who had a previous preterm birth. However, Fresno County differed from the whole state in a few ways. Unlike the state of California as a whole, Hispanic women, women over 34 years of age at delivery, and underweight women in urban residences in Fresno County were at increased risk for preterm birth. Also, education beyond 12 years did not provide protection against preterm birth in any of the Fresno County residences, although higher education did provide protection when we looked at the whole state of California. These differences point to specific pathways occurring in Fresno County that may be distinct from the state as a whole, and they demonstrate the value of place-based investigation of risk factors when examining a complex outcome such as preterm birth. Other regions may benefit from similar analyses to identify risk and protective factors that are important on a local level. An additional limitation, as with most administrative databases, is that the accuracy and ascertainment of variables are not easily validated. Previous studies of California birth certificate data suggest that race/ethnicity is a valid measure of self-identified race/ethnicity for all but Native Americans, and that the best obstetric estimate of gestation may underestimate preterm delivery rates. Previously reported rates of preterm birth in Fresno County are around 9.5%, whereas the rate was 8.4% overall in our population after removing multiple-gestation pregnancies and pregnancies with major birth defects. Additionally, United States estimates of drug dependence/use during pregnancy are 5.0% to 5.4%, whereas the rate was only 2.5% in our population. This underascertainment may mean that we are capturing the most severe diagnoses, potentially overestimating our risk calculations. Alternatively, underascertainment also implies that drug users were likely in our referent population, which would underestimate our risk calculations.
This examination of Fresno County preterm birth may provide important opportunities for local intervention. Several populations that deserve targeted interventions were identified as being at risk regardless of location of maternal residence.

Hyperconnectivity may be attributable to age differences between the two studies

In a follow-up study taking a multivariate factor-analytic approach, Jones et al. found that schizophrenia PRS was significantly associated with multiple psychopathology factors. However, these specific effects vanished when a general psychopathology factor was included, suggesting that psychopathology during adolescence may be explained by one broad factor. PS during adolescence are rather non-specific and pose risk for a variety of severe mental illnesses. Loohuis and colleagues therefore utilized a novel multi-trait approach including PRS for a broad range of psychiatric disorders, including neurodevelopmental disorders as well as brain and cognitive traits, to assess the association between these genetic risk factors and PS in youth. Interestingly, the ADHD PRS was the only significant predictor of PS in youth of European-American ancestry in the PNC, even after removing individuals endorsing any ADHD symptoms to avoid confounds related to phenotypic overlap. This finding was replicated in a sample of help-seeking CHR individuals. Further, the association between PS and ADHD PRS was age-dependent, such that the association was strongest in younger children. It is noteworthy that for individuals < 12 years only collateral information on psychopathology was available, which could affect the results. In addition to polygenic risk, recent exome sequencing studies have also found that rare and ultra-rare variants contribute to the genetic risk of schizophrenia. Overall, findings from these studies highlight the complex association between genetic risk and PS during adolescence. While such symptoms may be non-specific and presage later severe mental illnesses, polygenic risk may index global psychopathology as well as risk for specific diagnostic entities.
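Polygenic risk scores such as those discussed above are, at their core, weighted sums of risk-allele dosages, with weights taken from GWAS effect sizes. The sketch below is a bare-bones illustration under that standard definition, using made-up SNP identifiers and weights; in practice, dedicated tools such as PRSice or LDpred handle quality control, clumping, and p-value thresholding.

```python
# Hypothetical GWAS effect sizes (log odds ratios) for a handful of SNPs.
effect_sizes = {"rs0001": 0.05, "rs0002": -0.03, "rs0003": 0.10}

def polygenic_risk_score(dosages: dict[str, float]) -> float:
    """Weighted sum of risk-allele dosages (0, 1, or 2 copies, or imputed
    fractional dosages) over the SNPs with available effect sizes."""
    return sum(beta * dosages.get(snp, 0.0) for snp, beta in effect_sizes.items())

# One individual's allele dosages at the same SNPs (illustrative values).
print(polygenic_risk_score({"rs0001": 2, "rs0002": 1, "rs0003": 0}))
```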

Importantly, because PRS are currently derived almost entirely from European cohorts, their application to non-European ethnic groups is problematic; collection of ethnically diverse samples is a research imperative. Further, while PRS are far from clinical utility in the general population, as ever-increasing GWAS sizes improve the strength of these associations, such risk scores may approach clinical utility in enriched populations in the near future. Examples of publicly available population-based datasets in youth that include multi-modal imaging and neurocognitive assessments are the PNC and the Adolescent Brain Cognitive Development (ABCD) study. These samples offer unprecedented opportunities for the neuroscience community to study complex brain-behavior interactions during development. In particular, longitudinal data will allow for unique investigations of developmental trajectories. Given the young age of ABCD participants at study baseline, the study has the potential to capture the earliest signs of emotional and behavioral problems associated with subsequent severe mental illnesses. Table 2 summarizes large-scale epidemiological cohorts with multi-modal imaging. The PNC has led to a wealth of new findings regarding structural and functional brain alterations in youth experiencing PS; 1,445 youth aged 8 to 21 years were recruited from the greater Philadelphia area and underwent genotyping, multi-modal imaging, and neuropsychological testing. This sample was not ascertained for specific neuropsychiatric problems and includes multi-ethnic youth from various socio-economic backgrounds. Exclusion criteria were limited and included significant medical problems, intellectual disability, neurological and/or endocrine conditions, and general MRI contraindications. Importantly, all studies on PS in the PNC applied the same diagnostic criteria, offering comparability across studies. Furthermore, neuroimaging data were acquired with a single MRI scanner, reducing artifacts and heterogeneity due to scanner and study site variability. Gray and white matter morphology have been investigated in detail in the PNC. Reductions in local gray matter volume in youth experiencing PS relative to typically developing youth were observed in bilateral medial temporal lobes and were also associated with PS severity.

Further, a significant age-by-group interaction suggested that these local reductions in gray matter volume only became apparent in mid-adolescence in youth experiencing PS. This pattern of volume reductions in medial temporal regions mirrors a wealth of such findings not only in individuals with chronic schizophrenia, but also in individuals with first-episode psychosis as well as in individuals at clinical high risk for developing psychosis. Given that the medial temporal lobe in this study included both the amygdala and the parahippocampal cortex, this finding was followed up with a more detailed parcellation of the temporal lobe: whereas decreased volume of the left amygdala was associated with positive PS, decreased volume of the left entorhinal cortex was correlated with impaired cognition as well as more severe negative and disorganized symptoms, suggesting that variation in these brain structures may contribute to distinct symptom domains. Jalbrzikowski et al. subsequently investigated whole-brain differences in cortical thickness, surface area, and sub-cortical volume in PS youth in this cohort, relative to both youth with bipolar mood symptoms and typically developing youth. This study found thalamic volume reductions that were specific to PS. Again, these findings parallel those observed in individuals with overt psychosis and those at CHR, highlighting the role of the thalamus in neural system disruptions in psychosis. In terms of white matter microstructure, youth with PS also exhibited reduced fractional anisotropy in the retrolenticular internal capsule and the superior longitudinal fasciculus (SLF), possibly reflecting altered axonal diameter and/or myelination. Development of the SLF was associated with cognitive maturation in typically developing youth, an effect that was absent in youth experiencing PS. Overall, the alterations of brain morphology observed in these non-clinically ascertained cohorts of youth experiencing sub-threshold PS can be interpreted as further evidence for a psychosis continuum, given qualitatively similar alterations observed in individuals with overt illness and those at CHR for psychosis.

In terms of functional MRI, task-based brain function and resting-state functional connectivity have both been investigated in population-based studies of PS. In the PNC, two fMRI paradigms were acquired: an n-back task probing different working memory loads and an emotion identification task. Working memory is viewed as a higher cognitive/executive function consistently shown to be impaired in schizophrenia. Similarly, a wealth of evidence exists for impaired emotional processing in schizophrenia. Wolf et al. found reduced activation in the executive control network in response to increasing working memory demands, concomitant with worse performance, in PS youth relative to typically developing peers. Amygdala activation in response to threatening facial expressions was increased in PS youth compared to unaffected youth and was also positively correlated with positive symptom severity. Utilizing data from the IMAGEN study, which included longitudinal fMRI and measures of PS at follow-up, Papanastasiou et al. observed increases in right frontal activation during reward anticipation and feedback of win from age 14 to 19 that were associated with PS at age 19; this increase over time was not observed in youth who did not report PS. The authors speculate whether this finding could reflect a compensatory mechanism. However, given that PS were not assessed at age 14, the results are to be interpreted with caution. Resting-state fMRI has become a popular tool to study how distant brain areas are functionally connected. Unlike task-based fMRI, it is less susceptible to performance and vigilance differences between groups, which facilitates interpretation of group differences. Again, the PNC has allowed large-scale investigation of functional connectivity across development. With regard to static functional connectivity, Satterthwaite et al. showed that PS youth exhibited patterns of dysconnectivity similar to those of patients with overt psychosis. In particular, they observed hyperconnectivity within the default-mode network (DMN) and reduced functional connectivity within the executive control network. However, in one of the largest pediatric population-based samples, Karcher et al. recently reported hypoconnectivity within the DMN and within the executive control networks associated with increased PS in 9- to 11-year-old children. These differences in observed hypo- vs. hyperconnectivity may be attributable to age differences between the two studies. Nevertheless, there has been a similar dissonance in adult cohorts with overt psychosis, where both hypo- and hyperconnectivity of the DMN and executive control networks have been described.
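Within-network functional connectivity of the kind compared across these studies is commonly summarized as the mean pairwise correlation among the region-of-interest (ROI) time series assigned to a network. The sketch below illustrates that computation on synthetic data; it is not the preprocessing or analysis pipeline of any of the cited papers.

```python
import numpy as np

def within_network_connectivity(ts: np.ndarray, roi_idx: list[int]) -> float:
    """Mean pairwise Pearson correlation among the ROIs of one network.

    ts: array of shape (n_timepoints, n_rois) of preprocessed BOLD signals.
    roi_idx: column indices of the ROIs assigned to the network (e.g., DMN).
    """
    corr = np.corrcoef(ts[:, roi_idx], rowvar=False)
    upper = corr[np.triu_indices_from(corr, k=1)]  # unique pairs, no diagonal
    return float(upper.mean())

# Synthetic example: 200 timepoints, 10 ROIs, first 4 treated as a "network".
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 10))
print(within_network_connectivity(ts, roi_idx=[0, 1, 2, 3]))
```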

In an elegant follow-up study that applied multivariate sparse canonical correlation analysis to the PNC resting-state data, Xia and colleagues corroborated that segregation between the DMN and executive control networks is in fact a common feature across multiple psychopathology dimensions, with the psychosis dimension showing the strongest effect. Moreover, a recent study of this cohort that investigated dynamic properties of functional connectivity, i.e., time-varying patterns of whole-brain connectivity, found that the previously described dysconnectivity between the DMN and executive control networks in youth experiencing PS is time-dependent and only occurs during certain periods of a resting-state scan, whereas dysconnectivity in visual and sensorimotor areas is much more pervasive (a minimal sliding-window sketch of this kind of analysis appears below). The Human Connectome Project (HCP) is an adult cohort in which resting-state fMRI as well as self-reported PS were acquired. Here, PS were significantly inversely correlated with cognitive abilities, an effect that was partially mediated by global efficiency of the executive control network, a measure of network integration. With regard to dynamic functional connectivity in the HCP, it has recently been shown that adults experiencing PS spend more time in a dynamic state, i.e., a distinct time-varying connectivity pattern, characterized by reduced connectivity within the DMN, a finding that mirrors previous results in studies of individuals with overt psychosis.

Even though pediatric population neuroscience is still in its infancy, studies overwhelmingly find that PS in childhood and adolescence are a risk factor for later development of overt psychiatric illness and are overall associated with reduced functioning and quality of life. Many early intervention specialty programs offer a coherent multi-modal treatment framework for clients, including psychopharmacological treatment, psychotherapy and psychoeducation, as well as vocational counseling. Meta-analytic results suggest that multidisciplinary therapies can delay or prevent transition to overt psychosis. Low-risk psychosocial interventions targeting functioning have been shown to be effective in CHR youth; such approaches are likely to be effective in a broader audience as well. These results are reflected in the recently published guidelines of the European Psychiatric Association, where a dual treatment consisting of cognitive behavioral therapy and pharmacological treatment receives recommendation grade A for adult CHR individuals. For children and adolescents experiencing PS, as targeted by pediatric population neuroscience, the expert recommendation is specific psychological interventions to improve functioning and close monitoring of PS. PS are often preceded by non-specific behavioral and emotional problems in childhood related to increased adversity and trauma. Since these precursors in themselves pose a risk for the development of diverse psychopathologies, we argue, as others before us, that these childhood-onset problems offer another promising target for population-based preventive interventions. However, causal mechanisms from abnormal neurodevelopment to subsequent psychopathology are not yet understood and require further longitudinal research. Since only a minority of individuals with PS access appropriate mental health services, it will be important to implement services appropriate to a broad audience, for example in schools.
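The time-varying ("dynamic") connectivity analyses referred to above are often implemented as sliding-window correlations followed by clustering of the windowed connectivity patterns into recurring states. The sketch below shows that general recipe on synthetic data, with arbitrary window, step, and cluster settings rather than the parameters of any cited study.

```python
import numpy as np
from sklearn.cluster import KMeans

def sliding_window_fc(ts: np.ndarray, width: int = 30, step: int = 5) -> np.ndarray:
    """Upper-triangle connectivity vector for each sliding window.

    ts: (n_timepoints, n_rois) BOLD time series.
    Returns an array of shape (n_windows, n_roi_pairs).
    """
    n_t, n_rois = ts.shape
    iu = np.triu_indices(n_rois, k=1)
    windows = []
    for start in range(0, n_t - width + 1, step):
        corr = np.corrcoef(ts[start:start + width], rowvar=False)
        windows.append(corr[iu])
    return np.array(windows)

# Synthetic example: cluster windowed patterns into k recurring "states".
rng = np.random.default_rng(0)
ts = rng.standard_normal((300, 12))
patterns = sliding_window_fc(ts)
states = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(patterns)
print(np.bincount(states))  # time (in windows) spent in each state
```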
It will be essential to identify those individuals at highest risk and to reduce the number of false positives in order to provide cost-effective services and to reduce stigma. Individual risk calculators developed and tested in CHR cohorts may not work as well when the target audience is broadened. With sufficient longitudinal data, questionnaires such as the Psychosis Questionnaire, Brief Version may be suitable for community samples and may be used to develop risk calculators for youth in the general population. Given the evidence presented here and results from the Outreach and Support in South London and Headspace initiatives, we argue that findings from population-based studies are adequate for guiding policy-making toward further emphasis on public health efforts, although more systematic research is needed in this area. Destigmatization initiatives for mental illness have been shown to be effective in reducing discrimination and stigma, and broadly accessible mental health programs like Headspace and Jigsaw show promise for making a difference in the field of adolescent mental health. However, the specific efficacy of these programs warrants further study, and caution is advised so as not to over-pathologize the potentially transient occurrence of mental health problems.

The prevalence of alcohol, tobacco, and other substance use is higher among gay, bisexual, and other men who have sex with men than in the overall population. Although Hughes and Eliason noted that substance and alcohol use have declined in lesbian, gay, bisexual, and transgender populations, the prevalence of heavy alcohol and substance use remains high among younger lesbians and gay men, and in some cases older lesbians and gay men.

Resting-state functional connectivity was largely unaltered in these youths

Children of heavier reducers also reported greater withdrawn or depressed behavior, attention deficits, rule-breaking behavior, and aggression compared with children of light reducers. Significant associations are presented in Figure 4. Youths with exposure to any pattern of drinking exhibited greater total cerebral volume relative to unexposed youths in covariate-adjusted models. Regional brain volume and surface area disparities were also observed for all prenatal alcohol exposure groups compared with unexposed youths, although no significant differences were observed between prenatal alcohol exposure groups. When gradations of use were explored separately for heavier reducers, similar results were found for both groups. Results of all psychological, behavioral, cognitive, and neural indices analyses are provided in Table S10 in the online supplement. Structural brain indices were negatively associated with psychological and behavioral outcomes and partially mediated all significant associations between prenatal alcohol exposure and neurobehavioral outcomes in covariate-adjusted cross-sectional models. Inconsistent mediation was observed, where at least one of the mediated effects occurred in a direction opposite to that of the direct effect; for example, prenatal alcohol exposure was significantly associated with greater brain volume and surface area and with greater psychopathology and behavioral problems, while greater brain volume and surface area were negatively associated with psychopathology and behavioral problems. Conversely, for Flanker Task attention and inhibitory control performance, consistent positive associations were observed.
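The mediation language used here (direct, mediated, and "inconsistent" effects) follows the standard product-of-coefficients logic: path a from exposure to mediator, path b from mediator to outcome controlling for exposure, and the direct path c'; inconsistent mediation means a*b and c' have opposite signs. The sketch below illustrates that check with simulated data and ordinary least squares; it is not the covariate-adjusted models used in the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
exposure = rng.binomial(1, 0.25, n).astype(float)               # e.g., any prenatal exposure
mediator = 0.4 * exposure + rng.standard_normal(n)              # e.g., a brain-structure index
outcome = 0.3 * exposure - 0.5 * mediator + rng.standard_normal(n)  # e.g., a behavior score

# Path a: exposure -> mediator
a = sm.OLS(mediator, sm.add_constant(exposure)).fit().params[1]

# Paths c' (exposure) and b (mediator) -> outcome, in one model
X = sm.add_constant(np.column_stack([exposure, mediator]))
fit = sm.OLS(outcome, X).fit()
c_prime, b = fit.params[1], fit.params[2]

indirect = a * b
print(f"direct c' = {c_prime:.2f}, indirect a*b = {indirect:.2f}")
print("inconsistent mediation" if np.sign(indirect) != np.sign(c_prime)
      else "consistent mediation")
```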

Longitudinal mediation models replicated associations between prenatal alcohol exposure, varying baseline structural brain indices, and follow-up psychopathology and externalizing disorders. To our knowledge, this is the largest examination of prenatal alcohol exposure and psychological, behavioral, and neurodevelopmental outcomes in preadolescence. The estimated total number of drinks consumed during pregnancy ranged from 0 to 90 following outlier conversion. This alcohol dose is relatively low, and the parent-reported exposure patterns prevalent in the ABCD cohort are more typical and reflective of the general population than those investigated in previous studies of fetal alcohol spectrum disorder. Prenatal alcohol exposure of any severity was associated with greater psychopathology, impulsivity, and likelihood of being diagnosed with separation anxiety and oppositional defiant disorder, with some observed dose-related associations. Heavier exposure was also associated with greater withdrawn or depressed behavior, attention deficits, rule breaking, aggression, and a greater likelihood of being diagnosed with ADHD. Early, light exposure, compared with no exposure, was associated with better attention and inhibitory skills. Exposed youths also exhibited greater cerebral volume, in a dose-dependent manner, and greater volume and surface area, but not cortical thickness, throughout regions of the parietal, temporal, and occipital lobes, after accounting for potentially confounding factors. Aberrant brain structure partially mediated associations between prenatal alcohol exposure and psychological, behavioral, and cognitive outcomes at baseline and at the 1-year follow-up. These reported associations passed a stringent demographic-matching protocol. Unmodifiable factors greatly contributed to the large effect sizes in the adjusted models.

Of the modifiable factors, prenatal alcohol exposure was a critical determinant of brain structure and some neurobehavioral outcomes, accounting for >50% of the variance explained by modifiable factors. The findings were obtained in a largely substance-naive cohort of youths, allowing for investigation of the effects of prenatal alcohol exposure on the developing brain and behavior in the absence of youths’ own substance use, which is known to affect neurodevelopment. Our findings replicate previous clinical studies indicating that children exposed to alcohol in utero have higher rates of mental disorders and present with behavioral anomalies, including impulsiveness and attention deficits. Results from our dose-dependent and exposure-pattern analyses support the notion that the severity of psychopathology and behavioral problems depends on alcohol dose and timing of exposure. The present results are also consistent with previous reports using the ABCD cohort of associations between psychopathology, brain structure, and resting-state functional connectivity. Consistent with previous meta-analyses, a small, beneficial association between prenatal alcohol exposure and cognitive ability was observed. However, when participants were demographically matched, the vast majority of associations were no longer significant. This association may be the result of residual confounding from socioeconomic status and other demographic variables, as previously hypothesized. Other confounding variables not captured in this analysis may be contributing to the positive association between early, light exposure and attention and inhibition. The long-term neurostructural and functional effects of light maternal drinking, in offspring who do not necessarily present with fetal alcohol spectrum disorder, have not been well studied. Consistent with our findings, one study has reported larger regional volume among youths prenatally exposed to alcohol relative to unexposed youths. However, in contrast to our results, a common finding, when investigated both categorically and continuously, has been less volume and surface area among youths with fetal alcohol spectrum disorder and those with heavier prenatal alcohol exposure compared with unexposed youths. Furthermore, a previous study of youths with fetal alcohol spectrum disorder reported hypoconnectivity between numerous large-scale neurocognitive networks, yet in the present study, no significant alterations in resting-state functional connectivity were observed within or between these networks. The disparate findings may be explained by the large discrepancies in clinical severity of prenatal alcohol exposure between the ABCD sample and previous cohorts.

Heavier prenatal alcohol exposure may have a differential effect on preadolescent brain structure and function. Interestingly, some regions of the occipital, temporal, and parietal lobes exhibited an inverted-U association between alcohol dose and volume or surface area (a quadratic dose term of the kind sketched below is one way such an association can be tested). It is possible, therefore, that we would have observed reduced volume and surface area among youths exposed to heavier doses. Furthermore, potentially confounding factors in previous studies of children with heavier prenatal alcohol exposure or fetal alcohol spectrum disorder may contribute to the discrepant findings, such as greater co-occurring substance exposure, early-life stress, and quality of parental care. Importantly, our findings suggest that youths exposed to even light alcohol doses in utero exhibit widespread differences in brain structure when compared with unexposed youths. Finally, our results are consistent with previous studies of children with fetal alcohol spectrum disorder that have linked behavioral, psychological, and cognitive outcomes to changes in brain structure. However, our study is the first to test and identify inconsistent mediation between these variables. Similar to previous conclusions drawn on the effects of prenatal alcohol exposure, our results suggest that there is no safe threshold for alcohol consumption during pregnancy.

Alcohol is a known teratogen in utero, and it is thought to affect regions of the developing fetal brain via neural proliferation and migration errors, hypoxia, and cell death. The teratogenic effects likely differ as a result of dose, frequency, and timing of exposure and may vary across brain regions. Our findings demonstrate that there are complex effects of prenatal alcohol exposure on offspring development. Here, we provide four potential interpretations of mechanisms underlying associations between prenatal alcohol exposure, differences in brain structure, and neurobehavioral consequences. First, our results may reflect a compensatory response of some brain regions attempting to counter the effects of other, poorer-functioning regions affected by low alcohol doses. Our inconsistent mediation findings provide some support for this interpretation: greater brain volume and surface area were associated with better neurobehavioral outcomes, yet youths who were exposed to alcohol in utero exhibited greater volume and surface area but more neurobehavioral problems at baseline and follow-up. Despite a potential compensatory response of the brain to counter the effects of relatively low doses of alcohol, these youths continue to show subtle, yet poorer, psychological and behavioral outcomes through early life. Second, our findings may also suggest that relatively light prenatal alcohol exposure results in slightly atypical neurodevelopment. Such exposure may slow or alter the overall process of gray matter maturation, with greater absolute volume and surface area in exposed youths representing delayed or incomplete cortical pruning compared with this process in unexposed, prepubertal youths. Consistent with this hypothesis, we observed this trend largely in regions where gray matter loss in unexposed children progresses linearly from childhood through adolescence.
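An inverted-U dose association of the kind described above is usually probed by adding a quadratic dose term to the regional model and checking for a negative, significant coefficient. The sketch below shows that test on simulated data (dose range chosen to match the 0 to 90 drinks reported earlier) and is purely illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
dose = rng.uniform(0, 90, 500)                                   # total drinks (simulated)
volume = 2.0 * dose - 0.03 * dose**2 + rng.normal(0, 20, 500)    # simulated inverted-U outcome

# Regress the regional measure on dose and dose squared.
X = sm.add_constant(np.column_stack([dose, dose**2]))
fit = sm.OLS(volume, X).fit()
linear, quadratic = fit.params[1], fit.params[2]

print(f"linear = {linear:.3f}, quadratic = {quadratic:.4f} (negative => inverted U)")
print(f"estimated peak dose = {-linear / (2 * quadratic):.1f} drinks")
```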

Typically in this age group, the left hemisphere matures earlier than the right. Greater volume and surface area among exposed youths in left posterior cortices known to develop most rapidly between childhood and adolescence provide further support for delayed development. Examining the developmental trajectories of this cohort when multiple waves of imaging data are available will provide further insight into whether atypical development is occurring among exposed youths. Third, the inconsistent mediation findings may also partly capture the effects of the inverted-U associations between total alcohol dose and regional brain volume and surface area. Youths exposed to greater alcohol doses exhibited greater psychopathology and behavioral problems between ages 9 and 10 than youths exposed to lighter doses, and these more heavily exposed youths exhibited lower volume and surface area in regions of the parietal and temporal lobes than youths exposed to lighter doses. Lastly, there may be other critical changes resulting from prenatal alcohol exposure that mediate associations with brain structure differences and psychological and behavioral outcomes. For example, ethanol provokes a wide range of epigenetic modifications, including altered DNA and histone methylation, which persist from birth through childhood. Animal studies suggest that prenatal alcohol exposure affects DNA methylation through antagonistic effects on methyl donors, such as folate, and via long-lasting changes in gene expression. Preliminary evidence from studies of children with fetal alcohol spectrum disorder shows genome-wide differences in DNA methylation. Further research is required to examine epigenetic markers and their role in adverse outcomes among exposed youths; DNA methylation or other epigenetic markers could potentially provide objective indicators of prenatal alcohol exposure. Limitations of our study include potential maternal underreporting of alcohol use during pregnancy, imprecise retrospective data on the timing, amount, and frequency of alcohol exposure, and the absence of data on trimester-specific alcohol exposure. Underreporting by mothers who indicated alcohol use during pregnancy may have inflated the observed associations, while underreporting by mothers who indicated no alcohol use when they did in fact consume alcohol would have attenuated the associations toward the null. Future studies may benefit from interviewing an independent reporter of prenatal maternal alcohol use. Furthermore, data were not available on mothers who regularly consumed less than a full unit of alcohol. Therefore, youths exposed to this pattern of drinking would have been included in the unexposed group, potentially diluting outcome effects. Despite the large sample size, there were relatively few cases of youths exposed to stable light drinking throughout pregnancy, and too few cases of stable heavier drinking or increased consumption throughout pregnancy, to examine the impact on offspring. There is a larger body of existing evidence on the consequences of heavier alcohol exposure. The small sample size of youths exposed to light, stable drinking throughout pregnancy resulted in wider variance in outcome measures and may underestimate the true impact. Other notable explanatory variables of early life that may influence the observed associations between prenatal alcohol exposure and neurobehavioral outcomes include childhood adversity and quality of parental care.
These variables may contribute to mediating effects of neurodevelopment and possible epigenetic modifications. The baseline ABCD Study protocol did not capture these variables, although future waves will. Longitudinal analyses of this cohort should consider these variables as possible confounding factors. In addition, we did not examine the effect of preconception paternal alcohol exposure on preadolescent brain structure, and this should be explored in future studies.

In conclusion, relatively light levels of prenatal alcohol exposure were associated with small yet significantly greater psychological and behavioral problems, including internalizing and externalizing psychopathology, attention deficits, and impulsiveness. These outcomes were linked to differences in cerebral and regional brain volume and regional surface area among exposed youths ages 9 to 10 years. Examination of dose-dependent relationships and light alcohol exposure patterns during pregnancy shows that children with even the lowest levels of exposure demonstrate poorer psychological and behavioral outcomes as they enter adolescence. Associations preceded offspring alcohol use and were robust to the inclusion of potential confounding factors and to stringent demographic-matching procedures, increasing the plausibility of the findings. Women should continue to be advised to abstain from alcohol consumption from conception throughout pregnancy.

The past few years have shown remarkable advances in the treatment of acute ischemic stroke, particularly as they relate to revascularization.

While approximately half of participants from both study samples reported experiencing recent violence, violence did not predict unsuppressed VL in either study. In combination, the results may suggest differential effects of violence on women’s HIV outcomes in low-income populations, where competing factors like homelessness and incarceration outweigh or otherwise obscure violence effects. Prior evaluations by the WHO of HIV clinical effectiveness in resource-limited regions led to the realization that HIV clinical guidelines developed in resource-rich areas were neither feasible nor realistic for resource-limited areas. Consequently, large-scale efforts were launched to develop public-health approaches specific to providing ART in resource-limited areas, taking into account the realities of lower-capacity health systems. Findings presented here suggest that translating clinical findings from resource-rich regions back to the same geographic area may not be feasible or realistic for populations experiencing severe disparity relative to the general population of PLWH. Analogous to taking the realities of lower-capacity health systems into account, our results suggest that program development in high-income areas must take the realities of disparity into account, with interventions that not only consider the housing and living conditions of extremely low-income individuals but also prioritize them. Promising examples of such programs include several developed by the San Francisco Department of Public Health. For example, the “Linkage, Integration, Navigation, Comprehensive Services” program provides field-based navigators who offer short-term “intensive case management” to link homeless PLWH to HIV primary care. In addition, the “HIV Homeless-Health Outreach Mobile Engagement” program provides stabilization and out-of-clinic health care to PLWH who have complex needs and are not engaged in care.

Examples also include community-based programs that address the overall well-being of people living on the street, such as “Lava Mae,” a program that uses an approach known as “radical hospitality” to provide services, including mobile showers, toilets, clothing, food, and employment assistance. Results presented here suggest that additional novel programs for WLWH following incarceration may further address unsuppressed VL in this population. There are several limitations to this study. First, VL data were obtained from public and UCSF-affiliated clinics, which may have left out individuals receiving care from private physicians outside of UCSF. However, only 8% of persons receiving HIV care were excluded based on the absence of VL data, likely making potential effects small. Second, participants for whom VL data within three months of a study visit did not exist, and who were therefore excluded, were less likely to have an HIV case manager than participants who were included. Therefore, participants with case management and/or frequent clinic visits may have inflated the results. While these individuals had more opportunity for their information to fall inside the three-month window used to assess VL following a study visit, we minimized potential bias from multiple measures by using robust standard errors, which down-weighted data for participants with relatively frequent results (a minimal sketch of this kind of repeated-measures model appears below). We tested this assumption and confirmed that the number of observations was indeed not significantly associated with unsuppressed VL. In addition, while only two participants were newly diagnosed, time in care may be an unmeasured confounder in this study. Finally, study data came from one geographic location and focused on a high-risk population, which may limit generalizability. Future studies in different geographic areas that include more participants and multiple types of housing instability may contribute to a better understanding of variations in viral suppression.
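Repeated viral-load observations per participant are commonly handled with generalized estimating equations and cluster-robust standard errors, in the spirit of the approach described above. The sketch below shows the general statsmodels pattern with simulated data and made-up variable names; it is not the study's actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated long-format data: several visits per participant.
rng = np.random.default_rng(2)
n_participants, visits = 200, 4
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_participants), visits),
    "recent_violence": rng.binomial(1, 0.5, n_participants * visits),
    "homeless": rng.binomial(1, 0.3, n_participants * visits),
})
df["unsuppressed_vl"] = rng.binomial(1, (0.2 + 0.15 * df["homeless"]).to_numpy())

# Logistic GEE with an exchangeable working correlation; the reported
# standard errors are cluster-robust, so participants contributing many
# correlated observations do not dominate the estimates.
model = smf.gee("unsuppressed_vl ~ recent_violence + homeless",
                groups="pid", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```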

Study strengths include a community-based probability sample of women with a history of housing instability; assessment of recent substance use and victimization at multiple time points; inclusion of multiple living conditions; and a focus on the specific needs of extremely low-income women. Finally, study assessments were made in a resource-rich city where universal ART and integrated services were standard and viral suppression rates were high [58], which reduced unmeasured confounding due to service limitations.

Cocaine use is common among individuals infected with human immunodeficiency virus (HIV). The high comorbidity between HIV and cocaine use highlights the importance of ascertaining the potential compounding effect of HIV infection and cocaine use on neuropsychological functioning. Neuroimaging studies have found significant overlap with respect to changes resulting from HIV infection and cocaine use. Both conditions are associated with grey and white matter pathology in frontal, temporal, and cerebellar regions. Despite widespread use of highly active anti-retroviral therapies (HAART) and the subsequent reduced incidence of HIV-related dementia and cognitive dysfunction, a subset of those with HIV still demonstrate neuropsychological deficits. HIV-associated cognitive impairments include deficits in attention, processing speed and psychomotor abilities, working memory, and executive functioning. Among the neuropsychological domains impacted by HIV, verbal memory deficits have received considerable attention. Verbal memory impairment in HIV is associated with functional declines, such as decreased employability, poor medical adherence, and other functional impairments. Some studies suggest that verbal memory impairment in HIV is due to a retrieval deficit, evidenced by impairments in recall in contrast to intact recognition or cued recall. However, other work suggests that encoding deficits may also play a significant role in HIV-related verbal memory impairment. Specifically, studies have demonstrated HIV-related increased recency effects, decreased primacy effects, decreased semantic clustering, and deficient patterns of item recall during list learning. Moreover, other studies suggest that encoding deficits may be the primary contributing factor to HIV-associated memory impairment.

There are two likely explanations for these discrepant findings. First, differences in the HIV-associated memory deficit profile between studies may be related to variable use of HAART. Many of the studies that found both encoding and retrieval deficits were either conducted in the pre-HAART era or included participants with low HAART adherence, whereas studies that found encoding deficits included participants on HAART, although there are exceptions. Second, researchers who argue that HIV is generally associated with encoding and not retrieval deficits point out that recall and recognition likely reflect different processes. Specifically, recall appears to be the product of recollection, while recognition seems to comprise both recollection and familiarity, and it may be misleading to infer a retrieval deficit from recognition-recall discrepancies when recognition performance is greater than recall performance. That is, the recognition-recall discrepancies may be due to poor initial encoding rather than a true retrieval deficit. Cocaine is a potent stimulant that elevates synaptic levels of dopamine (DA), norepinephrine (NE), and serotonin (5-HT), binds to DA, NE, and 5-HT transporters, and blocks neurotransmitter reuptake, although most of the literature has examined the effect of cocaine on the mesocorticolimbic DA pathway, since it has been posited to play an important role in addiction. Elevated DA levels lead to increased D1 and D2 receptor signaling and subsequent intracellular signaling pathways associated with G proteins. G proteins impact cyclic AMP-dependent protein kinase, which influences ion channels, vesicles, receptor expression, and glutamatergic neurotransmission. These changes lead to increased excitability of the prefrontal cortex and are thought to account for the effects of cocaine. Neuroimaging studies have implicated structural, metabolic, and white matter changes related to long-term use of cocaine, although a recent critical review has questioned the methodology of these studies. Regarding the neuropsychological profile of cocaine use, acute effects of cocaine include enhanced response inhibition and psychomotor speed, while chronic use is associated with deficits in processing speed, attention and working memory, executive functioning, and verbal memory. Studies examining abstinence from cocaine suggest at least partial reversal of cognitive deficits with 1 year of abstinence, with residual deficits in reversal learning and emotional processing; however, a recent systematic critical review by Frazer, Richards, and Keith warns against overinterpreting group differences between cocaine users and non-users on cognitive measures and neuroimaging, as these differences may not indicate clinically significant discrepancies despite being statistically significant. Cocaine use in HIV has a synergistic impact on neuroimmune functioning, particularly dopaminergic neurotransmission. While cocaine blocks DA transporters and acts as a reuptake inhibitor, the Tat protein of HIV functions as an allosteric modulator of the DA transporter.

Consequently, in both cocaine use and HIV, elevated synaptic DA increases macrophage activity, leading to neuroinflammation and accelerating the production of platelet-monocyte complexes, which is related to HIV-associated neurocognitive disorder. Moreover, both cocaine use and HIV, directly and indirectly, impact the N-methyl-D-aspartate receptor (NMDAR), which represents a significant aspect of reward circuitry in the brain, and it has been posited that changes in the NMDAR secondary to HIV infection may reinforce psychostimulant abuse and addiction. In fact, HIV-infected individuals have a stronger preference for the immediate effects of stimulant drugs over alcohol and nicotine. A recent neuroimaging study by Meade et al. of active cocaine use in HIV during an intertemporal decision-making task revealed that cocaine use moderated the effects of HIV, with clustered activations in the bilateral prefrontal cortices and cerebellum. Independently, cocaine use was associated with lower activation in bilateral frontal gyri and right insular and posterior parietal cortices, while HIV was associated with higher activation in the visual cortex and reduced activation in bilateral prefrontal cortices, the cerebellum, and left posterior parietal cortex. From a neurocognitive perspective, as mentioned previously, chronic use of cocaine bears a striking resemblance to the cognitive dysfunction associated with HIV, with overlapping impairments in attention, processing speed, verbal memory, and executive functioning. Of the aforementioned domains, verbal memory is a complex cognitive process that requires multiple abilities working in tandem and warrants further inspection, namely dissecting which aspects of verbal memory are impacted. While some studies have found an impact on verbal memory, these findings may be influenced by reduced executive functioning and processing speed related to the synergistic impact of both cocaine and HIV. Specifically, Meade, Towe, Skalski, and Robertson found main effects for processing speed and executive functioning in HIV-positive cocaine users, with no significant effects for learning or memory. Traditional metrics of verbal memory may be confounded by attention and executive ability, domains that depend on the integrity of frontostriatal networks impacted by both HIV and cocaine use. Encoding metrics on verbal memory list tasks include learning slope, total correct on Trial 1, and other contrasts; unfortunately, attentional difficulties inherent in many neurological and psychiatric conditions can impact these scores. Similarly, consolidation metrics for verbal memory list tasks contrast total items recalled against recall following a delay, with a decline signifying a loss of information, or forgetting, over time; another metric relies on proactive interference. However, these metrics either provide non-specific findings of memory storage or do not account for initial learning. Retrieval metrics on verbal memory list tasks involve contrasting recall performance against recognition performance, although, as mentioned previously, it may be misleading to infer a retrieval deficit from recognition-recall discrepancies when recognition performance is greater than recall performance, since these discrepancies may be due to poor initial encoding rather than a true retrieval deficit. The Item-Specific Deficit Approach (ISDA) is a psychometrically valid method that aims to mitigate the impact of inattention on encoding, consolidation, and retrieval on list-learning tasks.
The ISDA was initially validated in a healthy comparison sample, an HIV-seropositive sample, and a traumatic brain injury sample; it has been replicated in these populations and has since been applied in other settings, including amyotrophic lateral sclerosis, amnestic mild cognitive impairment, and Alzheimer disease. Most recently, Lueke and Lueke used the ISDA to evaluate the impact of mindfulness on encoding, consolidation, and retrieval. Generally, the ISDA has demonstrated superiority over traditional metrics, with one exception: Cattie et al. raised concerns about the incremental value of the ISDA and noted that the ISDA encoding index was collinear with other memory tests. In the current study, we attempted to ascertain the additive impact of recent and past cocaine use on the different aspects of verbal memory in an HIV-seropositive sample using both traditional memory metrics and ISDA memory indices. Specifically, we examined whether recent cocaine use impacted memory performance and whether a lifetime diagnosis of cocaine dependence or abuse impacted memory performance. We hypothesized that recent, but not past, cocaine use would adversely impact verbal memory, given cocaine’s state-like effect on cognitive functioning. Additionally, a lifetime diagnosis of cocaine dependence or abuse, but not subclinical cocaine use, was hypothesized to adversely impact verbal memory, given that some research has posited at least some residual impact on brain structure.
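Several of the traditional list-learning metrics discussed above reduce to simple arithmetic on per-trial recall counts. The sketch below computes a learning slope, a delayed-recall retention contrast, and a recognition-minus-recall discrepancy on made-up scores; the ISDA indices themselves follow item-level definitions given in the original validation papers and are not reproduced here.

```python
import numpy as np

def learning_slope(trial_recall: list[int]) -> float:
    """Least-squares slope of items recalled across learning trials."""
    trials = np.arange(1, len(trial_recall) + 1)
    return float(np.polyfit(trials, trial_recall, 1)[0])

def retention_pct(last_trial: int, delayed_recall: int) -> float:
    """Percent of final learning-trial recall retained after the delay."""
    return 100.0 * delayed_recall / last_trial if last_trial else float("nan")

# Illustrative scores for a 5-trial list-learning task (hypothetical values).
trial_recall = [5, 7, 9, 10, 11]
delayed_recall, recognition_hits = 8, 14

print(f"learning slope: {learning_slope(trial_recall):.2f} items/trial")
print(f"retention: {retention_pct(trial_recall[-1], delayed_recall):.0f}%")
print(f"recognition - delayed recall discrepancy: {recognition_hits - delayed_recall}")
```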

How do you determine the optimal time for harvesting cannabis plants in a commercial setting?

Determining the optimal time for harvesting cannabis plants in a commercial setting is crucial for achieving the desired potency, flavor, and overall quality of the final product. The timing of the harvest is influenced by factors such as the strain of cannabis, the cultivation method, and the specific goals of the grower. Here are some general guidelines for determining the optimal time for harvesting cannabis in a commercial setting:

  1. Flowering Stage: Cannabis plants typically enter the flowering stage after a vegetative growth period. The flowering stage is when the plant starts producing buds. The length of the flowering stage varies by strain.
  2. Trichome Development: One of the most reliable indicators of harvest readiness is the development of trichomes. Trichomes are tiny, resinous structures that contain cannabinoids and terpenes. To assess trichome development, use a magnifying tool (such as a jeweler’s loupe) to examine the trichomes on the buds. Harvesting is often recommended when the trichomes are cloudy or milky in appearance. Some growers prefer to wait until a portion of the trichomes turns amber for a more sedative effect.
  3. Pistil Color: Another visual indicator is the color of the pistils (hairs) on the buds. In the early stages of flowering, pistils are often white. As the plant matures, the pistils change color. For many strains, a good time to harvest is when a significant portion of the pistils has changed from white to an amber or brown color.
  4. Cannabinoid and Terpene Profile: The desired cannabinoid and terpene profile for the final product also influences the harvest time. Different strains have different optimal cannabinoid and terpene profiles, and growers may have specific goals for the effects and flavors they want to achieve.
  5. Flush Period: Before harvesting, many growers implement a “flush” period, during which the plants are given only water to remove any residual nutrients. This helps improve the flavor and quality of the final product.
  6. Environmental Conditions: Consider the environmental conditions during the harvest. Harvesting during the early morning or late evening when temperatures are cooler can help preserve terpenes and prevent excessive moisture buildup.
  7. Strain-Specific Recommendations: Some strains have specific harvest windows recommended by breeders or experienced growers. Following these recommendations can be a good starting point.

It’s important for commercial cannabis growers to monitor and track the development of their plants regularly. Harvesting too early or too late can significantly impact the quality of the final product. Keep detailed records, experiment with small batches, and adjust your approach based on the specific characteristics of the strains you are cultivating. Additionally, compliance with local regulations regarding cannabis cultivation and harvesting is crucial for commercial operations.
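As a rough illustration of how the visual indicators above can be combined, the sketch below encodes a simple harvest-readiness check. The thresholds are hypothetical and for illustration only; real cutoffs vary by strain, cultivation method, and the product profile a grower is targeting.

```python
# Hypothetical harvest-readiness check based on trichome and pistil observations.
# Thresholds are illustrative only; actual targets vary by strain and product goals.

def harvest_recommendation(pct_cloudy_trichomes, pct_amber_trichomes, pct_brown_pistils):
    """Return a rough recommendation from visual observations (all inputs on 0-100 scales)."""
    if pct_cloudy_trichomes + pct_amber_trichomes < 70 or pct_brown_pistils < 50:
        return "Too early: most trichomes are still clear and pistils are mostly white."
    if pct_amber_trichomes < 10:
        return "Approaching the window: trichomes are mostly cloudy/milky."
    if pct_amber_trichomes <= 30:
        return "Harvest window: cloudy with some amber (a more sedative effect as amber increases)."
    return "Late window: heavily amber trichomes suggest a more sedative, degraded profile."

print(harvest_recommendation(pct_cloudy_trichomes=75, pct_amber_trichomes=15, pct_brown_pistils=70))
```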

What are the considerations for selecting the most suitable cannabis strains for indoor commercial cultivation?

Selecting the most suitable cannabis strains for indoor commercial cultivation involves considering various factors to ensure successful and efficient growth. Here are some key considerations:

  1. Strain Characteristics:
    • Indica vs. Sativa vs. Hybrid: Consider the characteristics of indica, sativa, and hybrid strains. Indicas are generally shorter and bushier, while sativas are taller with thinner leaves. Hybrids offer a combination of both.
    • THC and CBD Levels: Determine the desired levels of THC (tetrahydrocannabinol) and CBD (cannabidiol) in the final product. Different strains have varying cannabinoid profiles, affecting the psychoactive and therapeutic effects.
  2. Space and Growing Conditions:
    • Height and Size: Choose strains that fit well within the available vertical space. Some strains naturally grow taller, and managing height is crucial in indoor cultivation.
    • Climate Requirements: Consider the environmental conditions, such as temperature and humidity, that the strains prefer. Opt for strains that match the climate control capabilities of your indoor facility.
  3. Yield and Productivity:
    • Yield Potential: Assess the potential yield of each strain. High-yielding strains are often preferred for commercial cultivation to maximize output.
    • Grow Time: Consider the flowering time and overall life cycle of the strains. Some strains have shorter flowering periods, allowing for quicker turnaround between crops.
  4. Disease Resistance:
    • Resistance to Pests and Diseases: Select strains that are known for their resistance to common pests and diseases. This can reduce the need for pesticides and mitigate the risk of crop loss.
  5. Cultivation Complexity:
    • Suitability for Indoor Growing: Choose strains that adapt well to indoor cultivation. Some strains may require specific conditions or are better suited for outdoor growing.
    • Grower Expertise: Consider the level of expertise among your cultivation team. Some strains may be more forgiving for less experienced growers.
  6. Legal Considerations:
    • Compliance with Regulations: Ensure that the selected strains comply with local regulations. Some strains may have higher or lower THC levels, affecting their legal status in certain regions.
  7. Market Demand:
    • Consumer Preferences: Consider the preferences of the target market. Certain strains may be more popular among consumers, and growing in-demand strains can enhance market competitiveness.
  8. Testing and Monitoring:
    • Lab Testing: Regularly test and monitor the cannabinoid and terpene profiles of your strains. This ensures consistency in the final product and helps meet consumer expectations.
  9. Supplier Reputation:
    • Genetics Source: Choose reputable suppliers for your cannabis genetics. The quality of the seeds or clones can significantly impact the success of your cultivation.
  10. Budget Considerations:
    • Cost of Seeds or Clones: Consider the budget for acquiring seeds or clones. Some strains may be more expensive than others, impacting the overall cost of cultivation.

By carefully considering these factors, you can tailor your selection of cannabis strains to fit the specific requirements and goals of your indoor commercial cultivation operation.
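One way to make trade-offs among these criteria explicit is a simple weighted scoring matrix. The sketch below is a minimal example; the strain names, criterion scores, and weights are hypothetical and should be replaced with values reflecting an operation's own priorities.

```python
# Minimal weighted-scoring sketch for comparing candidate strains (hypothetical data).

CRITERIA_WEIGHTS = {
    "yield": 0.25, "flowering_speed": 0.15, "disease_resistance": 0.20,
    "height_fit": 0.15, "market_demand": 0.15, "grower_friendliness": 0.10,
}

candidates = {  # each criterion scored 1 (poor) to 5 (excellent)
    "Strain A": {"yield": 5, "flowering_speed": 3, "disease_resistance": 4,
                 "height_fit": 4, "market_demand": 5, "grower_friendliness": 3},
    "Strain B": {"yield": 3, "flowering_speed": 5, "disease_resistance": 5,
                 "height_fit": 5, "market_demand": 3, "grower_friendliness": 5},
}

def weighted_score(scores):
    """Sum of criterion scores multiplied by their weights (weights sum to 1.0)."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Rank candidates from highest to lowest weighted score.
for name, scores in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```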

What are the best practices for optimizing space utilization in a commercial indoor cannabis cultivation facility?

Immediately after the shock, the patient sustained burn marks and experienced short-term memory loss and fatigue. Three days later, the patient saw an internist and reported upper right quadrant pain, headaches, numbness, weakness, fatigue, insomnia, and minimal, first-degree burn marks on his right underarm and on the dorsal aspect of both forearms. The surface area was 1.1% for each forearm, and an additional 1.1% for the right underarm, for a total affected area of 3.3%. One week later, the patient received MRIs of the lumbosacral spine, cervical spine, and brain, all of which reported no abnormalities. One month later, the patient visited a psychologist regarding anxiety, insomnia, and depression, and was diagnosed with post-traumatic stress disorder (PTSD) and retrograde amnesia. Three months after the electrical injury, the patient saw an ophthalmologist regarding pain behind his right orbit and “drooping” of the right side of his face; he was diagnosed with Bell’s palsy. Two years after the incident, the patient had an orthopedic evaluation for right-side body pain, loss of right hand motor control, right hand tremors, pain behind the right orbit, and headaches, with no orthopedic abnormalities found. The following day, the patient visited a neurologist and a different ophthalmologist regarding the same symptoms, with no abnormalities found. Three years after the electrical injury, the patient visited a neurologist regarding hypesthesia of the right side of the face and to pinprick in the right hand, severe pain in the right arm and hand, and moderate pain in the left arm and hand, and was diagnosed with electrocution neuropathy.

Five months later, the same neurologist noted improvement of the pain in the right arm and hand. During the same year, the patient visited a therapist, was diagnosed with PTSD, severe anxiety, and situational depression, and was prescribed psychotherapy as treatment. Six years after the injury, additional documentation of the damage sustained from the electrical injury was needed to provide objective evidence as part of a lawsuit against the electric company responsible for the exposed wires. The patient visited our laboratory for MRI diffusion tensor imaging (DTI) and quantitative volumetric analysis, and a clinical neuropsychologist for an exam. At the time of the neuropsychological exam, the patient was taking Bupropion XL, Clobex, Hydrocodone/Acetaminophen, melatonin, Klonopin, Namenda, Neurontin, and medical marijuana. On the Diller-Weinberg Test, the patient missed 39/47 stimuli, and his visual encoding/processing speed on specific Wechsler Adult Intelligence Scale subtests was between the 1st and 5th percentiles. On the dominant finger-tapping test, the patient scored in the 5th percentile. His performance on a timed task of fine motor dexterity was impaired, between 2 and 3 standard deviations below the mean, and his motor and processing speed index was in the 2nd percentile, which is a typical residual deficit of electrical injury. The patient scored 20 fewer points on his Performance intelligence quotient (PIQ) than his Verbal IQ (VIQ), which is statistically significant and notably unusual. He scored in the severely depressed range on the Beck Depression Inventory and had severe chronic pain and PTSD symptoms in the clinical range. The patient’s past medical history was significant for meningitis at age 10, and arthritis and hypertension as an early adolescent. The patient underwent several unrelated orthopedic surgeries for sports-related injuries, with the last surgery sixteen years before the electrical injury. According to his ex-fiancé, the patient was very social and outgoing before the electrical shock, while he became withdrawn and isolated afterwards.

The patient enjoyed activities such as surfing, swimming, hiking, basketball, and skateboarding, all of which he was unable to do, or did differently, after the injury. At the time of the incident, he was in good health and working as a physical trainer.

In the presence of an external electric field, cell membrane permeabilization occurs as the lipids in the lipid bilayer undergo reorganization in a process known as electroporation. In turn, cell contents such as ions are able to move freely in and out of cells. Through the phenomenon of electroporation, current is able to travel through the body and leave through the second contact point to a grounding source. Clearly, these aspects of electrical injury (EI) are quite mechanistic; however, one of its enigmas is the remote neuropsychological deterioration of the patient regardless of the trajectory of the current. EI has been known to cause a spectrum of neuropsychological and psychiatric disorders. Duff compiled a review of twenty-eight studies of EI and lightning injury patients, logging 2738 victims who reported a total of 4441 signs or symptoms. These signs/symptoms were “categorized into nine different domains of sequelae, which included disturbance of consciousness, attention/concentration deficits, speech/language deficits, sensory deficits, memory deficits, other cognitive deficits, psychiatric complaints, somatic complaints, and neurological complaints”. Another study of the long-term sequelae of low-voltage electrical injury by Singerman reported numbness, weakness, and memory problems as the most frequent neurological problems and anxiety, nightmares, insomnia, and flashbacks of the event as the most frequent psychological problems. Since the literature suggests EI causes neuropsychological sequelae, it is worth using MRI techniques to examine any structural abnormalities and cerebral lesions. Irregularities observed on MRI scans are generally unique to each EI case; however, white matter hyperintensities (WMH) found on fluid-attenuated inversion recovery (FLAIR) image sequences are a common finding.

The latter three of the case studies cited all report WMH specifically in the cerebral corticospinal tract. EI has also been known to cause hypoxia, which is characterized by cytotoxic edema in the cortex of the central region and the basal ganglia.

The average lamppost in a densely populated city, such as New York City, operates on single-phase 120 V/240 V, 60 Hz AC received from a nearby three-phase generator. The patient received an electrical shock after submerging his hands in a puddle on a sidewalk charged with stray voltage from a nearby lamppost. Workers from the electrical company in the area testified that exposed ends of an electrical cable of a lamppost were causing 8 V of stray voltage. Using what is known about wet skin resistance, we can also assume that the patient’s hand had a resistance of 1000 Ω, while the patient’s internal body had a resistance of 300 Ω. Rearranging Eq. , we calculate the current passing through the patient’s hand to be approximately 8 mA, while the current passing through the internal body is approximately 26 mA. However, since salt water is more conductive than pure water, this would have potentially lowered the resistivity of the patient’s hand, causing the current passing through his hands to be comparatively higher and thus accounting for the no-let-go phenomenon he experienced. To examine the validity of this approximation, we consider the patient’s dog, which went into seizure upon stepping in the charged puddle. A study by Woodbury investigated the stimulus parameters needed to induce electroshock seizures in rats and found that at 60 Hz AC, the current needed to promote seizures was 17.7 mA. This is extremely similar to the current, 16 mA, needed to induce the no-let-go phenomenon in the average male. Thus we can assume with substantial confidence that the current passing through the patient’s hand was roughly 16 mA AC.

At the time of the neuropsychological exam, the patient was taking multiple medications that could have potentially affected cognitive performance. An investigation of these potential effects was conducted. Depressed patients treated with Bupropion scored similarly to normal, healthy controls on neuropsychiatric tests that assessed verbal memory, visual memory, finger tapping, and symbol-digit coding. On the dominant finger-tapping test, our patient scored in the 5th percentile, while on the coding subtest, he scored in the 10th percentile. The patient’s visual and verbal memory scores were average. In a study that assessed the neuropsychiatric effects of hydrocodone, subjects who had taken hydrocodone performed 10% worse than the mean on the motor performance test, while no variance was found on simple and complex reaction time tests. Our patient scored in the 2nd percentile on the motor and processing speed index. In a study of 38 patients taking clonazepam, 8 patients experienced behavioral side effects while 30 patients did not. The mean absolute discrepancy between VIQ and PIQ of the 8 patients was 17.5 points, while the discrepancy between VIQ and PIQ of the 30 patients who did not experience behavioral side effects was 6.5 points. Our patient’s VIQ-PIQ difference was 20 points. No study has been done on the effects of memantine on cognitive behavior for patients without Alzheimer’s disease (AD), but for patients with AD, memantine improved language and memory scores in comparison to a placebo group.
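Returning to the stray-voltage estimates above, the short sketch below reproduces the Ohm's-law arithmetic (I = V/R) using the reported 8 V source and the assumed wet-skin and internal-body resistances; the resistance values are the approximations stated in the text, not measured quantities.

```python
# Sketch of the Ohm's-law estimates described above (I = V / R).
V_STRAY = 8.0        # volts, stray voltage reported at the lamppost
R_HAND = 1000.0      # ohms, assumed wet-skin resistance of the hand
R_INTERNAL = 300.0   # ohms, assumed internal body resistance

i_hand_mA = V_STRAY / R_HAND * 1000          # ≈ 8 mA through the hand
i_internal_mA = V_STRAY / R_INTERNAL * 1000  # ≈ 26.7 mA through the internal body

print(f"Current through hand: {i_hand_mA:.1f} mA")
print(f"Current through internal body: {i_internal_mA:.1f} mA")
# For comparison, ~16 mA at 60 Hz AC is the commonly cited no-let-go threshold
# for the average adult male (see text).
```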

Gonzalez measured the effects of cannabis on cognitive performance by determining overall indices of neuropsychological performance and running individual neuropsychological tests. Habitual cannabis users performed one-fifth of a standard deviation worse than controls in overall index scores and performed significantly worse on memory tests. The patient’s performance on memory tests and his full-scale IQ were rated average. No effects of melatonin on neurocognitive performance were found. No effects of gabapentin on neurocognitive performance were found.

Methamphetamine is a highly addictive psychomotor stimulant. It is estimated that approximately 1.1 million Americans aged 12 years or older meet criteria for methamphetamine use disorder and that 205,000 individuals initiated methamphetamine use in 2018. Methamphetamine is associated with hypersexuality, and its use is predictive of riskier sexual behaviors, such as higher frequencies of unprotected sexual intercourse, as well as needle sharing, which can lead to increased risk of HIV transmission, among other infections. While engagement in substance use treatment may decrease sexual risk behaviors and subsequent adverse health outcomes, less focus has been placed on precipitating and perpetuating factors of methamphetamine use and sexual risk behaviors. For example, loneliness is linked to increased sexual risk behaviors in the general population. Loneliness is a common human experience, with nearly half of Americans reporting feeling lonely “sometimes” to “always”. Loneliness is defined as a feeling that accompanies the perception that one’s social needs are not being met by the quantity, or especially the quality, of one’s social relationships. Thus, loneliness is the perception of being alone, rather than objective social isolation. It has been linked to a myriad of adverse mental and physical health outcomes, including depression, anxiety, anger, suicide, cognitive decline, Alzheimer’s disease, poor cardiovascular health, and type II diabetes. These negative health outcomes are a consequence of, and/or exacerbated by, poor health behaviors that may arise from loneliness. Mechanistically, feeling alone is instinctually related to feelings of being unsafe, which in turn increases sympathetic activation, according to one loneliness model. Chronic hypervigilance, coupled with cognitive biases that the world is a threatening place and other negative social expectations, may lead to behaviors that further isolate and exacerbate loneliness. Being engrossed in this self-fulfilling prophecy has significant impacts on health-related behaviors. Furthermore, emotion regulation, as well as other types of self-control behaviors, becomes compromised when someone feels lonely. Inadequate self-regulation may contribute to the relationship between loneliness and substance abuse. This relationship is likely bidirectional: some individuals may self-medicate with methamphetamine in response to distressing feelings of loneliness, whereas others may first engage in methamphetamine use and subsequently find themselves unable to participate in activities that maintain positive social relationships, leading to feelings of social isolation. Intuitively, this feedback loop between methamphetamine use and loneliness could have direct or indirect effects on increased sexual risk behaviors and successive adverse health outcomes. Previous work has shown that loneliness and methamphetamine use are independently associated with riskier sexual behaviors.
Loneliness, together with methamphetamine use, may confer additive risk for engaging in riskier sexual behaviors. That is, a lonely individual who turns to methamphetamine to cope with feelings of loneliness may be more likely to engage in riskier sexual behaviors, given the hypersexuality and impulsivity that accompany methamphetamine use. Moreover, a methamphetamine user whose social network has eroded to the point of experiencing loneliness may lack opportunities to engage in safer alternatives to risky sex. Generally, attitudes and norms about health behaviors are linked to concurrent and future intentions and to engagement in those health behaviors, including sexual risk behaviors. Individuals who use methamphetamine, and perhaps particularly those who are lonely, may have different assessments of risk and consequences in relation to safe sex than those who do not use methamphetamine. If true, addressing beliefs and intentions to practice safer sex in this particularly vulnerable population may be an important treatment focus with critical public health implications.

Support for screening includes favorable feasibility studies, favorable attitudes by parents and adolescents toward suicide risk screening in general EDs, and the potential to identify adolescents with previously unrecognized suicide risk who are receiving no MH services. Although suicidal ideation is a well-established suicide risk factor, study findings suggest that screening questions about current suicidal ideation or a recent suicide attempt are insufficient if used as the sole triage or “go/no-go” questions for determining whether or not a youth may be at risk for suicide. As seen in this study and others, not all individuals who make suicide attempts report current suicidal ideation. Furthermore, suicidal ideation is only a modest predictor of suicide attempts within clinical samples of adolescents and has failed to predict suicide attempts among adolescent males in the year following their psychiatric hospitalization. A computerized adaptive screen, which is under development in the ED-STARS study, may be more effective in identifying the full range of youth at risk for suicidal behavior. Further research is also recommended to examine the longitudinal trajectories of youth who report a past history of suicide risk only. They may represent a subgroup that denies current problems for fear of intervention, of being stigmatized by self or others, or of a loss of freedom if hospitalized. Our study sample was composed entirely of adolescents who were known to be at elevated risk for suicide based on previously identified risk factors. Nevertheless, the fact that 46% of the sample reported a history of multiple suicide attempts is striking.

The rate varied from a low of 34% for the HX-STB profile to a high of 71% for the S-STB+AGG profile. The overall high rate for this sample is consistent with that documented by a recent study that sampled psychiatrically hospitalized adolescents and reported a similar multiple-attempt rate of 53%. It is also notable that sexual and physical abuse characterized a significant minority of adolescents who fit each of the five identified profiles, which is consistent with results from a recent meta-analysis of the association between childhood maltreatment and suicide risk. These findings suggest that, although profile characteristics varied, all youth in our sample were at elevated risk, pointing to the importance of better understanding issues of mental health service utilization. In this study, race was differentially distributed across latent classes. There were small proportions of Black adolescents in the S-STB+AGG and S-STB classes, which were the classes most likely to have a history of MHSU and to present to the ED with a psychiatric chief complaint. Thus, the difference in MHSU by race parallels the difference in the distribution of latent class profiles for MHSU. The disparities in MHSU for minority groups have been well documented. It is important to understand the factors that influence clinicians’ and caregivers’ decisions on the need to use mental health care for their patients and children, as well as barriers to accessing these services. This study had multiple strengths, including its large sample size; the recruitment of adolescents from pediatric EDs in PECARN, which were characterized by geographic, racial/ethnic, and economic diversity; and the broad range of risk factors available for LCA. Findings should be considered, however, within the context of study limitations. This study was conducted in the pediatric EDs of large academic health systems, which are not representative of all medical EDs, including those of smaller community hospitals. In addition, because we were assessing a broad range of risk factors and working within the time and space constraints of EDs, with consideration of patient burden, many risk factors were assessed with brief, adapted scales.

In addition, the choices we made in limiting variables for the LCA may differ from those of other investigators, as it is possible to examine multiple iterations. Although more than half of the adolescents who screened positive for suicide risk did not present to the ED with a psychiatric chief complaint, it is possible that some of them had another chief complaint yet did receive psychiatric help. Also, because we rely only on lifetime MHSU, it is not possible to know whether these services occurred before or after the STB. We also do not have information about diagnosed psychiatric disorders, which could be expected to impact MHSU. Finally, our LCA profile descriptors use simple summaries to capture multidimensional concepts and do not perfectly characterize each individual within those groups. For example, a non-negligible proportion of adolescents in the HX-STB class had multiple suicide attempts. Addressing the heterogeneity of clinical presentations among adolescents at elevated risk for suicide attempts, we identified five profiles of adolescents at risk with differing patterns of risk factors. MHSU was relatively common among adolescents characterized by the profiles with recent and severe suicidal thoughts and behavior, with or without aggression. However, MHSU was much less common among adolescents who only reported a history of suicidal thoughts and behavior, despite the fact that many of these youth had a lifetime history of multiple suicide attempts and/or other known suicide risk factors. MHSU was also lower among adolescents from racial and ethnic minority groups. In addition to implementing effective strategies for the recognition of suicide risk, these findings suggest the importance of facilitating treatment engagement and retention. Some of the strategies found to be helpful include the incorporation of motivational interviewing principles and attention to family stress, family coping, and broader family system issues. Care navigators and matching the race and ethnicity of clinical providers and families may also be helpful.

Finally, a recent review of 50 randomized controlled trials examining the effectiveness of treatment engagement interventions for child mental health services concluded that specific interventions can improve engagement and work across youth with varying racial and ethnic identifications and mental health problems.

In this study, the largest qualitative study among South Africans who smoke heroin, we found that trajectories to smoked heroin use were heavily influenced by social and structural factors. Similar to transitions to injection heroin use in other settings, participants’ friends, peers, drug merchants, and others who were using and/or selling heroin figured prominently in the initiation narratives of those who started smoking heroin in this context. For example, participants’ social contacts distributed heroin to participants, glamorized its use, and/or encouraged its use. Participants seemed to exhibit a “peer preference” and assorted with like-minded peers, some of whom were already smoking heroin. The impact of social and structural factors could also be observed among participants who reported initiating heroin as a means to manage psychosocial distress. For example, “walking through the squatter camp” or otherwise being exposed to environments where people were smoking marijuana influenced these participants’ trajectories. The concept of a “risk environment” is a useful framework for evaluating HIV risk among people who inject drugs, and applied here it captures the dynamic interactions between practices, places, people, beliefs, and other social and structural factors surrounding the people who transitioned to smoked heroin in our study. Although prior marijuana use was ubiquitous among participants with both vertical and horizontal trajectories, social and structural influences on initiation narratives were also ubiquitous. According to the Gateway Hypothesis, smoking marijuana may have been facilitative of smoking heroin. However, rather than marijuana use being a critical stage in a sequence toward smoking heroin, our findings suggest that the connection between smoking marijuana and smoking heroin may be enhanced by overlapping or shared risk environments. Both marijuana and heroin are smoked in a similar cigarette form and used or distributed by people who share the same spaces. Exposure to the social context of marijuana and heroin use, and not exposure to marijuana itself, was a critical event in the trajectory toward smoked heroin use. Altogether, these data support a growing body of literature demonstrating that social and structural forces are important mediators of substance use initiation. Accordingly, describing trajectories as vertical and horizontal in this context is not meant to imply a sequence of drug use but instead to characterize different groups at risk for smoked heroin initiation. For example, several participants with horizontal trajectories reported trying heroin to alleviate the negative effects of hard drugs and heavy alcohol use. Although the use of heroin to “facilitate the descent” from crack cocaine has been described in Senegal, ours is the first report of the initiation of heroin to moderate the intoxicating effects and/or paranoia from stimulant use and as a means to stop drinking alcohol. Participants often started smoking heroin under these circumstances upon the advice of their social contacts.

Because of the high prevalence of polysubstance use among heroin users throughout Africa, these types of horizontal trajectories underscore the importance of understanding knowledge of and attitudes toward substance use treatment and of providing access to comprehensive treatment services. Our findings are largely consistent with what has been observed in other heroin epidemics across Africa. Vertical trajectories of smoked heroin initiation have been similarly described in Tanzanian youth who are introduced to heroin-laced cigarettes. Economic pressures, rapid modernization, and parental mortality in the HIV epidemic have been implicated in disrupting family structures and leaving youth vulnerable to drug use. Limited opportunities to participate in the formal economy bring youth and adults into informal spaces where they can be exposed to drug use. In the context of stress and financial hardship, drug use may be an appealing means to cope, and drug dealing may be a means to make a living in the informal marketplace. The similarities between South Africa and Tanzania are worrisome because, if the heroin epidemic in South Africa follows the same path, South Africa may also experience a marked expansion of injection heroin use. In Tanzania, the less refined, “brown” heroin was primarily smoked in the 1980s and 1990s. From 1998 to 2003, a variety of factors fueled an increase in injection use: 1) the high from smoking heroin waned with repeated exposure to the drug, 2) the more refined, “white” heroin became available, and 3) the tools for injection became more widely available. The absence of a similar widespread transition in South Africa may be due to the availability of white heroin and/or the “technology” of injection drug use. In addition to monitoring substance use treatment statistics and epidemiological surveys, following inventories of heroin seizures, reviewing reports of healthcare utilization for injection-related health problems, and conducting periodic qualitative research examining pathways of initiation among heroin users could also reveal trends in injection drug use. If there is a significant shift in the number of people smoking heroin who transition to injection, HIV incidence will likely also increase. Additional surveillance can inform efforts to scale up needle and syringe programs and medication-assisted treatment with methadone or buprenorphine. Our data also suggest that smoked heroin may be subject to specific social and structural forces that inhibit initiation of injection use. Unlike in Tanzania, people initiating smoked heroin in South Africa do not appear to progress rapidly to injection drug use. In fact, smoked heroin and injection heroin use may occur in distinct risk environments. According to participants, these risk environments diverge along racial lines. Despite efforts to dismantle apartheid-related policies of racial segregation in South Africa, racial disparities persist post-apartheid. Resultant social and structural inequalities affect economic and health outcomes, and societal transitions may also influence drug use trends in communities and populations. Research comparing people who inject heroin with those who smoke heroin may provide additional insight into “ecological containment” or other aspects of the risk environment that might explain group differences in drug use.
Research of this nature will also inform the development of interventions specifically designed to address social and structural determinants of health affecting people who smoke heroin and their communities. In the context of these differences, it is difficult to compare our findings to research on injection heroin use. For example, in Tanzania it is reported that people initiating others into injection heroin use were generally older than the initiates, but disparate age relationships between initiators and initiates of smoked heroin were not described in our narratives. Additionally, it may be difficult to apply interventions targeting injection drug use to this population. Nevertheless, similarities in the social and structural influences on initiation of smoked and injection heroin use suggest that approaches to prevent injection heroin use and other substance use may be relevant in this context.

This study suggests that in middle-aged PWH without severe confounding medical conditions and with high rates of ART use, there is not a greater than expected decline in delayed recall. However, more research is needed to determine more definitively whether there is accelerated memory decline in middle-aged PWH. Lastly, while there was some indication that peripheral CRP may be associated with memory, overall, most biomarkers of inflammation were not associated with episodic memory, and the medial temporal lobe did not mediate a relationship between inflammation and episodic memory. However, given the limitations described above, ongoing research on this topic is needed. In summary, this study found that memory may be more related to HIV disease than to preclinical AD, and delayed recall did not significantly decline over several years. This is positive news given that HIV-associated neurocognitive impairment is usually non-progressive. However, more research is needed in older PWH, in whom aMCI/AD would be more expected.

Brief interventions have empirical support for acutely reducing alcohol use among non-treatment-seeking heavy drinkers. For example, randomized clinical trials of brief interventions have found favorable results among heavy drinkers reached through primary care, trauma centers, and emergency departments. Brief interventions have also shown effectiveness in reducing alcohol use in non-medical settings among a young adult college population. Given this sizable evidence base, there is considerable interest in understanding the underlying mechanisms toward optimizing this approach.

Neuroimaging techniques allow for the examination of the neurobiological effects underlying behavioral interventions, probing brain systems putatively involved in clinical response to treatment. To date, one study has examined the effect of a motivational interviewing-based intervention on the neural substrates of alcohol reward. In this study, neural response to alcohol cues was evaluated while individuals were exposed to change talk and counterchange talk, which are thought to underlie motivational changes during psychosocial intervention. The authors report activation in reward processing areas following counterchange talk, which was not present following exposure to change talk. Feldstein Ewing and colleagues have also probed the origin of change talk in order to better understand the neural underpinnings of change language. In this study, binge drinkers were presented with self-generated and experimenter-selected change and sustain talk. Self-generated change talk and sustain talk resulted in greater activation in regions associated with introspection, including the inferior frontal gyrus and insula, compared to experimenter-elicited client language. These studies employed an active ingredient of MI within the structure of the fMRI task, thus allowing for a more proximal test of treatment effects. Neuroimaging has also been used to explore the effect of psychological interventions specifically focused on alcohol motivation on changes in brain activation. For example, cue-exposure extinction training, a treatment designed to prevent return to use by decreasing conditioned responses to alcohol cue stimuli through repeated exposure to cues without paired reward, has also been evaluated using neuroimaging. Alcohol-dependent patients who underwent cue-exposure extinction training had larger decreases in neural alcohol cue-reactivity in mesocorticolimbic reward circuitry than patients who had standard clinic treatment.

Cognitive bias modification training, which similarly trains individuals to reduce attentional bias toward alcohol cues, resulted in decreased neural alcohol cue-reactivity in the amygdala and reduced medial prefrontal cortex activation when approaching alcohol cues. These studies suggest that fMRI tasks may be sensitive to treatment response. Further, neurobiological circuits identified using fMRI can be used to predict treatment and drinking outcomes, providing unique information beyond that of self-report and behavior. Individuals with alcohol use disorder (AUD) who return to use demonstrate increased activation in the mPFC to alcohol cues compared to individuals with AUD who remain abstinent. Moreover, the degree to which the mPFC was activated was associated with the amount of subsequent alcohol intake, but not with alcohol craving. Activation in the dorsolateral PFC to alcohol visual cues has been associated with a higher percentage of heavy drinking days in treatment-seeking alcohol-dependent individuals. Increased activation in the mPFC, orbitofrontal cortex, and caudate in response to alcohol cues has also been associated with the escalation of drinking in young adults. Mixed findings have been reported for the direction of the association between cue-induced striatal activation and return to use: both increases and decreases in ventral and dorsal striatal activation to alcohol cues have been associated with subsequent return to use. Utilizing a different paradigm, Seo and colleagues found that increased mPFC, ventral striatal, and precuneus activation to individually tailored neutral imagery scripts predicted subsequent return to use in treatment-seeking individuals with AUD. Interestingly, brain activity during individually tailored alcohol and stress imagery scripts was not associated with return to use.

While initial evidence indicates that psychological interventions are effective at reducing mesocorticolimbic response to alcohol-associated cues, few studies have prospectively evaluated whether psychosocial interventions attenuate neural cue-reactivity that in turn reduces drinking in the same population. Furthermore, no previous studies have used neural reactivity to alcohol cues to understand the mechanisms of brief interventions. Therefore, this study aimed to examine the effect of a brief intervention on drinking outcomes and neural alcohol cue-reactivity, and the ability of neural alcohol cue-reactivity to predict drinking outcomes. Specifically, this study investigated: 1) whether the brief intervention would reduce percent heavy drinking days or drinks per week in non-treatment-seeking heavy drinkers in the month following the intervention, and 2) whether the brief intervention would attenuate neural alcohol cue-reactivity. In the first case, we predicted significant effects on drinking based on the existing clinical literature; in the second case, we predicted decrements in alcohol’s motivational salience based on the feedback about the participant’s drinking levels relative to clinical recommendations and their personal negative consequences of drinking. The effects of neural cue-reactivity on subsequent drinking outcomes were tested in order to elucidate patterns of neural cue-reactivity that predict drinking behavior prospectively.

Participants were recruited between November 2015 and February 2017 from the greater Los Angeles metropolitan area. Study advertisements described a research study investigating the effects of a brief health education session on beliefs about the risks and benefits of alcohol use. Inclusion criteria were as follows: engagement in regular heavy drinking, as indicated by consuming 5 or more drinks per occasion for men or 4 or more drinks per occasion for women at least 4 times in the month prior to enrollment, and a score of ≥8 on the Alcohol Use Disorders Identification Test (AUDIT). Exclusion criteria included being under the age of 21; currently receiving treatment for alcohol problems, a history of treatment in the 30 days before enrollment, or currently seeking treatment; a positive urine toxicology screen for any drug other than cannabis; a lifetime history of schizophrenia, bipolar disorder, or other psychotic disorder; serious alcohol withdrawal symptoms, as indicated by a score of ≥10 on the Clinical Institute Withdrawal Assessment for Alcohol-Revised; history of epilepsy, seizures, or severe head trauma; non-removable ferromagnetic objects in the body; claustrophobia; and pregnancy. Initial assessment of the eligibility criteria was conducted through a telephone interview. Eligible participants were invited to the laboratory for additional screening. Upon arrival, participants read and signed an informed consent form. Participants then completed a series of individual differences measures and interviews, including a demographics questionnaire and the Timeline Followback to assess quantity and frequency of drinking over the past 30 days. All participants were required to test negative on a urine drug test. A total of 120 participants were screened in the laboratory; 38 did not meet inclusion criteria and 12 decided not to participate in the trial, leaving 60 participants who enrolled and were randomized. Of the 60 individuals randomized, 46 completed the entire study.
See Figure 1 for a CONSORT diagram for this trial. The study was a randomized controlled trial. Participants were assessed at baseline for study eligibility, and eligible participants returned for the randomization visit up to two weeks later. During their second visit, participants completed assessments and were then randomly assigned to receive a 1-session brief intervention or an attention-matched control condition. Immediately after the conclusion of the session, participants completed a functional magnetic resonance imaging scan to assess brain activity during exposure to alcohol cues and completed additional assessments. Participants were followed up 4 weeks later to assess alcohol use since the intervention through the 30-day Timeline Followback interview. Participants who completed all study measures were compensated $160. The brief intervention consisted of a 30–45 minute individual face-to-face session based on the principles of motivational interviewing. The intervention adhered to the FRAMES model, which includes personalized feedback, emphasizing personal responsibility, providing brief advice, offering a menu of change options, conveying empathy, and encouraging self-efficacy. In accordance with MI principles, the intervention was non-confrontational and emphasized participants’ autonomy.

The content of the intervention mirrored brief interventions to reduce alcohol use that have been studied with non-treatment-seeking heavy drinkers. The intervention included the following specific components: 1) giving normative feedback about frequency of drinking and of heavy drinking; 2) the Alcohol Use Disorders Identification Test score and associated risk level; 3) potential health risks associated with alcohol use; 4) placing the responsibility for change on the individual; 5) discussing the reasons for drinking and the downsides of drinking; and 6) setting a goal and change plan if the participant was receptive. The aim of the intervention was to help participants understand their level of risk and to help them initiate changes in their alcohol use. Sessions were delivered by master’s-level therapists who received training in MI techniques, including the use of open-ended questions, reflective listening, summarizing, and eliciting change talk, and in the content of the intervention. All sessions were audiotaped and rated by author MPK for fidelity and for quality of MI interventions using the Global Rating of Motivational Interviewing Therapists. On the 7-point scale, session scores ranged from 5.87 to 6.93 with an average rating of 6.61 ± 0.23, which indicates that the MI techniques used in the intervention were delivered with good quality. Supervision and feedback were provided to therapists by author MPK following each intervention session. The treatment manual is available from the last author upon request. Participants randomized to the attention-matched control condition viewed a 30-minute video about astronomy. In the control condition there was no mention of alcohol or drug use beyond completion of research assessments. Both the intervention and attention-matched control sessions took place within the UCLA Center for Cognitive Neuroscience, in separate rooms from the neuroimaging suite.

For the intervention effect on drinking, linear mixed model analyses were conducted to test for the main effect of the intervention on the average number of drinks per week and the percent of heavy drinking days in the 4 weeks post intervention. One model was run for each dependent variable. The intercept was a random effect. The models accounted for sex, smoking status, and age as covariates. The intervention effect was evaluated by testing the time-by-condition interaction. Comparative effect size estimates for the effect of the intervention on drinking outcomes were calculated from adjusted models using d = B(condition × time) / SD(pooled baseline). In addition, the effects of neural cue-reactivity on drinking outcomes were also examined. For the analysis of the cues task, all first-level analyses of imaging data were conducted within the context of the general linear model (GLM), modeling the combination of the cue and taste delivery periods convolved with a double-gamma hemodynamic response function (HRF), and accounting for temporal shifts in the HRF by including the temporal derivative. Alcohol and water taste cues were modeled as separate event types. The onset of each event was set at the cue period with a duration of 11 seconds. Six motion regressors representing translational and rotational head movement were also entered as regressors of no interest. Data for each subject were registered to the MBW, followed by the MPRAGE using affine linear transformations, and then normalized to the Montreal Neurological Institute template. Registration was further refined using FSL’s nonlinear registration tool.
The Alcohol Taste > Water Taste contrast was specified in the first-level models. Higher-level analyses combined these contrast images within subjects and between subjects. Age, sex, cigarette smoking status, and positive urine THC were included as covariates. Additional analyses evaluated whether neural response to alcohol taste cues was predictive of drinking outcomes.
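For readers who want to see the drinking-outcome analysis in code form, the sketch below illustrates a linear mixed model with a random intercept, a time-by-condition interaction, and the covariates named above, followed by the comparative effect size d = B(condition × time) / SD(pooled baseline). This is a minimal illustration, not the authors' code; the file name and column names (subject_id, time, condition, drinks_per_week, sex, age, smoker) are assumptions.

```python
# Minimal sketch of the mixed-model analysis described above (not the authors' code).
# Assumed long-format data: one row per participant per time point.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("drinking_outcomes_long.csv")  # hypothetical file name

# Random intercept per participant; fixed effects for time, condition, and covariates.
model = smf.mixedlm(
    "drinks_per_week ~ time * condition + sex + age + smoker",
    data=df,
    groups=df["subject_id"],
)
result = model.fit()
print(result.summary())

# Comparative effect size from the adjusted interaction coefficient.
# The exact parameter name depends on how condition is coded (assumed 0/1 numeric here).
b_interaction = result.params["time:condition"]
sd_pooled_baseline = df.loc[df["time"] == 0, "drinks_per_week"].std()
d = b_interaction / sd_pooled_baseline
print(f"Effect size d = {d:.2f}")
```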

One strategy to address this potential problem would be referral to a board-certified veterinary nutritionist to ensure any home-prepared diet is complete and balanced. An alternative strategy could be to discuss with the owner their concerns about commercial pet foods. Collecting a comprehensive nutritional history is not only important for ensuring dietary needs are met, but the conversation could also lead to discussion of perceived problems with commercial pet foods. The current study did not find the accompanying decrease in commercial diets that has been shown elsewhere, with the vast majority of owners using a commercial diet for part or all of their dog’s food. As our sample comprised dogs with a recent diagnosis of cancer, this might suggest that inclusion of home-prepared elements precedes the complete exclusion of commercial diets, and that our survey was conducted too close to the time of diagnosis to detect exclusion of commercial diets. However, among owners feeding a commercial diet both before and after diagnosis, nearly half stopped feeding the diet fed before the cancer diagnosis. It is possible that our sample would ultimately have stayed on their second commercial diet, rather than eliminating commercial elements entirely. Among owners feeding commercial diets, we found a decrease in the use of grain-free foods, from 22% to 14% among all 128 respondents, after a cancer diagnosis. While this could seem contrary to the concerns of some owners regarding the role of carbohydrate in promoting cancer progression, possible benefits of the low-carbohydrate approach have not been supported by any studies. Further, grain-free diets can be lower, similar, or higher in carbohydrate content compared to other diet categories.

There has been considerable attention to the association between dilated cardiomyopathy (DCM) in dogs and the use of grain-free diets, and both veterinarians and pet owners might have increased awareness of this issue. Regardless, given that more than 1 in 5 dogs in the present study were fed a grain-free diet before a cancer diagnosis, these data highlight the need for clinicians to discuss the risk of diet-associated DCM with all dog owners. The most common informational resource for diets and supplements was veterinarians, similar to previous studies of dogs. Veterinarians are a key resource for providing nutritional information, especially after a cancer diagnosis, when veterinarians are actively involved with care and around three-quarters of pet owners believe a change is necessary. Additionally, as our data show, many dog owners do alter their dog’s diet. These findings underscore the importance of collecting and assessing a thorough diet history. This enables effective client counseling by the veterinary care team to help guide and ensure the safe use of diets, treats, and supplement products. Our study did not differentiate whether veterinary advice was taken from general practitioners, cancer specialists, nutritionists, or elsewhere. Further specifying where owners receive information in a future study would be beneficial for understanding whose dietary advice pet owners value the most. To assess which factors were most likely to result in diet changes, we created a logit model. Our logit model showed that one predictor of owners making diet changes was median census tract income, with the odds of a diet change decreasing as tract income increased. This suggests that people in wealthier areas might be less likely to alter their dog’s diet in response to a diagnosis of cancer. Larger studies are warranted to confirm and further investigate this pattern. One limitation of the current study was that it only involved dogs referred to a single hospital’s oncology service. Coupled with time restrictions, this survey might not have recruited a large enough sample size to detect all of the patterns in nutritional alteration after a cancer diagnosis.
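A minimal version of the logit model described above might look like the following sketch, which regresses whether an owner changed the dog's diet on median census tract income. The data file and column names are assumptions for illustration; the real model presumably included additional predictors.

```python
# Minimal logistic-regression sketch (not the authors' model).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file with diet_changed (0/1) and tract_income_k (median income in $1,000s).
df = pd.read_csv("owner_survey.csv")

logit = smf.logit("diet_changed ~ tract_income_k", data=df).fit()
print(logit.summary())

# Odds ratio per $1,000 increase in median tract income; a value below 1 is
# consistent with lower odds of a diet change in wealthier census tracts.
print(np.exp(logit.params["tract_income_k"]))
```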

Furthermore, dog owners within the geographical area of the survey might not be representative of the greater population of dogs and owners. Additionally, dog owners visiting oncology services are a subset of the overall dog owner population, meaning these data can only apply to dogs with a recent cancer diagnosis presenting for evaluation by a specialist. Any owners who decided not to pursue a second opinion or further treatment would not have visited the oncology service, and because of treatment-associated costs, respondents to this survey could have more disposable income. This study sought to capture a single snapshot in time, namely, when a dog initially presented to an oncology service. We do not know if this sample of dogs would have eventually shown similar or different patterns than other studies, such as exclusion of commercial diets and use of social media groups for dietary and supplement recommendations. It is also possible that these owners would either revert to previously fed diets and supplements or make more extreme changes after treatment. Although we attempted to capture the time point shortly after diagnosis, there was still a median delay of 61 days from diagnosis to survey. This is likely due to the nature of online survey distribution and the wait to get an oncology appointment, which was exacerbated by the pandemic. Additionally, some dogs received cancer-related treatments elsewhere before presenting to the oncology service. As a result, some dogs were already undergoing or had finished treatments at the time of taking the survey, some of which might have caused gastrointestinal issues before survey completion. Nonetheless, we feel that the time frame from diagnosis to survey enabled us to capture additional nutritional changes beyond those made simply because of an immediate medical need, such as cancer- and treatment-related gastrointestinal signs.

Further study is warranted into how specific treatments might result in changes to what owners feed their dogs. This study also tried to balance the quality and completeness of data obtained with respondents’ time and willingness to complete a lengthy survey. One concern was that adding too many questions would result in many owners not reaching the end of the survey. Since owners who made changes were asked additional questions, we felt these owners would disproportionately fail to reach the end of the survey, possibly skewing results. Another consideration in interpreting the results of this study was whether owners who changed their dog’s diet or supplements could recall what was previously given. Based on initial piloting of the survey, some owners did not recall their dog’s previous diets and supplements and were frustrated by the survey. As a result, the survey program did not force a response for these questions. This was done to ensure owners who did not remember previous nutritional information would be able to complete the survey without guessing unknowns. While we feel this goal was achieved, it is also likely that some owners who remembered simply skipped past these questions for the sake of time. This study strove to be inclusive of all answers by providing text boxes, often referred to as “other” within the survey, if the owner felt the listed multiple-choice options for a question did not apply. However, as the owners largely filled out the survey online by themselves, many either did not list what we were looking for or possibly used the text box as an additional place to put information, rather than intending to respond with “other.” These factors limited the value of the free-text responses, and we feel that future studies could avoid these issues by either limiting free-text responses in favor of more comprehensive multiple-choice options or by administering the survey in person. Overall, many dog owners make alterations to diet or supplements after their dog has been diagnosed with cancer. Clinicians should counsel owners regarding cancer treatment and its relation to nutrition, assess the current diet, and enable educated decisions about any changes. Topics of focus could include owner concerns regarding commercial diets, formulation of home-prepared diets, and the use of certain herbal supplements, including mushrooms and CBD.

Contrary to the hypothesis, medial temporal lobe structures were not significantly associated with odds of being impaired on recognition. Given the limited number of participants who were impaired on recognition, there may not have been enough power to detect an effect; however, the odds ratios were fairly close to 1, indicating the association was neither statistically nor clinically significant. Also contrary to the aim 1a hypothesis, a thinner pars opercularis, part of the prefrontal cortex, was significantly associated with greater odds of being impaired on recognition.

No other prefrontal or basal ganglia regions were significantly associated with odds of being impaired on recognition. Aim 1b examined the relationship between continuous delayed recall and the three regions of interest. Delayed recall was hypothesized to be more equally associated with all three regions, given that delayed recall deficits are observed in both aMCI/AD and HAND. Somewhat consistent with this hypothesis, thicker rostral middle frontal gyrus and pars opercularis were associated with better delayed recall. When laterality was examined, these findings were driven somewhat more by the right hemisphere. Additionally, thicker right pars triangularis was significantly associated with better delayed recall, whereas the left pars triangularis was not. Contrary to the hypothesis, delayed recall was not significantly associated with the medial temporal lobe or the basal ganglia. In post hoc analyses that excluded participants not on ART, those with a detectable viral load, and those with methamphetamine use disorder (a subset closer to patients who are ideally treated in medical care), these associations held: thicker rostral middle frontal gyrus and pars opercularis remained associated with better delayed recall, and the relationships were somewhat stronger within this subset of participants. It is important to note that, because delayed recall was examined continuously, these findings do not imply that these prefrontal regions are associated with delayed recall impairment, as impairment was not examined. Moreover, mean cortical thickness was included in the models as a covariate, so these associations were observed after accounting for average cortical thickness. Taken together, the finding that episodic memory was associated with some prefrontal structures may suggest that, at least in middle age, episodic memory performance is more likely related to frontally mediated etiologies, such as HIV, than to early AD pathology. Neither the inferior frontal gyrus, which includes the pars opercularis, pars triangularis, and pars orbitalis, nor the middle frontal gyrus is part of the medial limbic circuit implicated in memory formation, but both can still contribute to memory deficits. The prefrontal cortex is, of course, associated with memory retrieval. Additionally, more recent models of memory formation stress the importance of the prefrontal cortex, given research suggesting that it aids in enabling long-term memory formation through connections with the anterior thalamic nuclei. These updated models could also account for why recognition was associated with prefrontal structures, although there could be several other explanations for this observed association. For example, recognition may also be associated with prefrontal structures due to poor initial encoding, which was not explicitly examined in these analyses. Nevertheless, functional MRI studies have shown alterations in prefrontal and hippocampal regions during memory tasks in PWH compared with controls, further highlighting that prefrontal regions are implicated in memory in PWH. As highlighted in the introduction, HIV studies have found structural changes throughout the brain, including frontal regions, compared with persons without HIV. Additionally, studies have demonstrated accelerated age-related atrophy, or greater-than-expected “brain age,” in middle-aged and older PWH compared with HIV-negative participants.
For example, Milanini et al. (2019) found that a group of 19 participants with HAND, who were on average 64 years old, showed faster atrophy in the cerebellum and frontal gray matter than HIV-negative controls. Additionally, Pfefferbaum et al. (2014) found accelerated changes in the frontal lobe, temporal pole, parietal lobe, and thalamus in PWH compared with HIV-negative controls. Of the studies examining longitudinal brain changes, all found some involvement of the frontal lobe, but most did not examine which specific regions within the frontal lobe were driving these associations. Additionally, results from these studies were mixed as to whether brain changes were associated with changes in cognition. Given that the current study examined structural MRI at only one time point, we cannot assume that atrophy of the prefrontal cortex has occurred; however, because the literature demonstrates frontal atrophic changes and accelerated frontal aging in PWH, it is possible that changes in the prefrontal cortex have occurred in this cohort and are contributing to the observed associations with memory.
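To make the modeling structure described above concrete, the sketch below shows, under stated assumptions, how the two kinds of models could be specified: a logistic regression for the odds of recognition impairment (aim 1a) and a linear regression for continuous delayed recall (aim 1b), each with a regional cortical thickness as the predictor of interest and mean cortical thickness as a covariate. The file name, column names (recog_impaired, delayed_recall, pars_opercularis_thk, mean_thk, on_art, detectable_vl, meth_dx), and the use of Python's statsmodels are illustrative assumptions and do not reflect the study's actual code or full covariate set.

# Illustrative sketch only -- not the study's actual analysis code.
# Assumed columns: recog_impaired (0/1), delayed_recall (continuous score),
# pars_opercularis_thk (regional thickness), mean_thk (mean cortical thickness),
# plus assumed flags on_art, detectable_vl, meth_dx for the post hoc subset.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("roi_thickness.csv")  # hypothetical file name

# Aim 1a-style model: odds of recognition impairment vs. regional thickness,
# adjusting for mean cortical thickness (odds ratios = exp of coefficients).
logit_fit = smf.logit(
    "recog_impaired ~ pars_opercularis_thk + mean_thk", data=df
).fit()
print(logit_fit.summary())

# Aim 1b-style model: continuous delayed recall vs. regional thickness,
# again adjusting for mean cortical thickness.
ols_fit = smf.ols(
    "delayed_recall ~ pars_opercularis_thk + mean_thk", data=df
).fit()
print(ols_fit.summary())

# Post hoc subset analysis: restrict to participants on ART with an
# undetectable viral load and no methamphetamine use disorder.
subset = df[(df["on_art"] == 1) & (df["detectable_vl"] == 0) & (df["meth_dx"] == 0)]
print(smf.ols("delayed_recall ~ pars_opercularis_thk + mean_thk", data=subset).fit().params)

In the actual analyses, demographic and clinical covariates would presumably also be included; the sketch only illustrates the structure of the comparisons reported above.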