Category: Research

This category covers topics that are currently being researched and where key findings are shared.

  • Ontological Insecurity: The Path of Existential Anxiety, Uncertainty, and Depth

    Ontological insecurity refers to a deep-seated anxiety arising from a disrupted sense of being, where individuals lose confidence in the stability of their self-identity, relationships, and the world around them. Coined by psychiatrist R.D. Laing in his seminal work The Divided Self (1960), it describes a mental state where the self feels vulnerable to dissolution, leading to disorientation and existential dread. Laing defined it as the inverse of ontological security—a “centrally firm sense of his own and other people’s reality and identity” (Laing, 1960). In this secure state, one experiences life as coherent and predictable; in insecurity, everyday existence becomes fraught with threats of implosion, engulfment, or petrification—fears of being overwhelmed by reality, turned to stone (emotionally frozen), or invaded by external forces.

    Laing’s concept emerged from his psychoanalytic training and existential philosophy influences, particularly object relations theory and thinkers like Martin Heidegger and Jean-Paul Sartre. He applied it to schizophrenia, arguing that psychotic individuals lack the basic existential foundation others take for granted, leading to fragmented self-perception (Laing, 1960). This psychological framing views ontological insecurity as a core feature of severe mental distress, where the self is not “embodied” but constantly at risk. Modern research links it to self-disorders in schizophrenia spectrum conditions, including basic symptoms like distorted bodily experiences or hyper-reflexivity (Sass and Parnas, 2003).

    Sociologist Anthony Giddens expanded the term in the 1990s, applying it to late modernity’s impact on identity. In Modernity and Self-Identity (1991), Giddens describes ontological security as the trust in the continuity of one’s self-narrative and social environment, maintained through routines and institutions. Ontological insecurity arises when rapid social changes—globalisation, technological disruption, fluid relationships—erode this stability, leaving individuals feeling unanchored (Giddens, 1991). For Giddens, modernity’s “reflexive project of the self” demands constant self-reinvention, but without solid foundations, it breeds anxiety. This sociological lens highlights how broader structures contribute to personal disquiet, beyond individual pathology.

    Causes of ontological insecurity are multifaceted. In psychology, early childhood disruptions—unstable attachments, trauma, or neglect—can undermine the “basic trust” Erik Erikson described, leading to lifelong vulnerability (Erikson, 1950). Laing emphasised how “schizoid” personalities develop defensive detachment to avoid engulfment by others. Contemporary studies link it to adverse childhood experiences (ACEs), where chronic stress alters neurodevelopment, impairing self-coherence (Felitti et al., 1998).

    Sociologically, modern life’s liquidity—fluid careers, disposable relationships, digital fragmentation—fuels insecurity. Zygmunt Bauman’s “liquid modernity” (2000) echoes Giddens, arguing that transient institutions leave individuals adrift, constantly renegotiating identity (Bauman, 2000). The COVID-19 pandemic exemplified this: lockdowns and disrupted routines amplified isolation and existential doubt. Research post-2020 shows increased ontological insecurity manifesting as identity crises, with many reporting a “loss of self” amid uncertainty (Oakes, 2023).

    Manifestations vary. Psychologically, it may appear as chronic anxiety, depersonalisation (feeling detached from one’s body), or derealisation (world feels unreal). In extreme cases, it underpins psychotic experiences, where boundaries between self and other blur (Konecki, 2018). Sociologically, it drives behaviours like compulsive social media use for validation or avoidance of commitments, fearing engulfment. Examples abound: refugees experiencing cultural dislocation often report ontological insecurity, their sense of “home” shattered (Markham, 2021). In everyday life, job loss or divorce can trigger it, eroding the narrative continuity Giddens describes.

    Impacts are profound. Ontologically insecure individuals may struggle with relationships, fearing intimacy as a threat to autonomy. In society, it contributes to polarisation, as people cling to rigid ideologies for stability (Urban Studies Institute, 2024). Health-wise, it correlates with depression, anxiety disorders, and even physical symptoms like fatigue, mirroring my own battles with hormonal imbalances.

    Coping strategies draw from both fields. Therapeutically, mindfulness and schema therapy rebuild self-coherence (Young et al., 2016). Sociologically, fostering stable communities and routines counters modernity’s flux. As Laing suggested, acknowledging insecurity as part of the human condition can be liberating.

    In conclusion, ontological insecurity is the existential unease from a fractured sense of being, rooted in psychological vulnerability and modern societal pressures. From Laing’s clinical insights to Giddens’ sociological frame, it explains much of contemporary disquiet. Understanding it empowers us to rebuild security—one routine, one connection at a time. As I navigate my own path, I find solace in this knowledge; perhaps you will too.

    References

    Bauman, Z. (2000) Liquid modernity. Polity Press. Available at: https://www.politybooks.com/bookdetail/?isbn=9780745624099 (Accessed: 10 March 2026).

    Erikson, E. H. (1950) Childhood and society. Norton. Available at: https://wwnorton.com/books/9780393310344 (Accessed: 10 March 2026).

    Felitti, V. J. et al. (1998) ‘Relationship of childhood abuse and household dysfunction to many of the leading causes of death in adults’, American Journal of Preventive Medicine, 14(4), pp. 245–258. Available at: https://www.ajpmonline.org/article/S0749-3797(98)00017-8/fulltext (Accessed: 10 March 2026).

    Giddens, A. (1991) Modernity and self-identity: Self and society in the late modern age. Polity Press. Available at: https://www.politybooks.com/bookdetail/?isbn=9780745609324 (Accessed: 10 March 2026).

    Konecki, K. T. (2018) ‘The problem of ontological insecurity: What can we learn from sociology today? Some Zen Buddhist inspirations’, Qualitative Sociology Review, 14(2), pp. 50–68. Available at: http://www.qualitativesociologyreview.org/PL/Volume42/PSJ_14_2_Konecki.pdf (Accessed: 10 March 2026).

    Laing, R. D. (1960) The divided self: An existential study in sanity and madness. Penguin Books. Available at: https://www.penguinrandomhouse.com/books/264434/the-divided-self-by-r-d-laing/ (Accessed: 10 March 2026).

    Markham, A. (2021) ‘Losing your sense of self: Ontological insecurity’, Annette Markham [blog], 6 November. Available at: https://annettemarkham.com/2021/11/losing-your-sense-of-self-ontological-insecurity (Accessed: 10 March 2026).

    Oakes, M. B. (2023) ‘Ontological insecurity in the post-covid-19 fallout: Using existentialism as a method to develop a psychosocial understanding to a mental health crisis’, Health Psychology and Behavioral Medicine, 11(1), pp. 1–15. Available at: https://pmc.ncbi.nlm.nih.gov/articles/PMC10425504/ (Accessed: 10 March 2026).

    Sass, L. A. and Parnas, J. (2003) ‘Schizophrenia, consciousness, and the self’, Schizophrenia Bulletin, 29(3), pp. 427–444. Available at: https://academic.oup.com/schizophrBull/article/29/3/427/1879716 (Accessed: 10 March 2026).

    Urban Studies Institute (2024) ‘Ontological insecurity in the modern world: Understanding its origins’, Urban Studies Institute, 21 July. Available at: https://urbanstudies.institute/urban-construct-development-dynamics/ontological-insecurity-modern-world-origins (Accessed: 10 March 2026).

    Young, F. (2016) A history of exorcism in Catholic Christianity. Palgrave Macmillan. Available at: https://link.springer.com/book/9783319291116 (Accessed: 10 March 2026).

  • The Infamous GCSE Question

  • I Stand Against The Modern Romanticisation of Pederasty, and Other Sexual Vicissitudes

    I lay in bed staring at the ceiling. Too many thoughts rush through my mind. Too many memories of injustices which might never end. A repertoire of traumas that I can only wish I could shake off. But I cannot; the scar that sexual abuse left in my life cannot be erased. It cannot be healed. It cannot be forgotten. It haunts me every day…

  • Ten (π∞) Ways to Measure Probability in Relation to an Incident

    Probability does not have to mean complicated math. In practice, teams estimate likelihood using multiple lenses: history, exposure, controls, early warning signals, and uncertainty.

    Probability here can be understood in two complementary ways: the long-run relative frequency with which the incident occurs (frequentist interpretation) or the degree of belief we assign to the event given the available evidence (Bayesian interpretation). Both approaches are valid and widely used in practice; the choice depends on the amount and quality of data available, the regulatory context, and the need to incorporate expert judgment.

    Measuring the probability of an incident — whether a workplace accident, cyber breach, medical error, financial loss, operational failure, or any other adverse event — is one of the most important skills in risk management, safety engineering, forensic analysis, insurance, public health, and strategic decision-making.

    1. Classical (A Priori) Probability

    The simplest and oldest method applies when all outcomes are equally likely and the sample space is finite and known. Probability is then simply the ratio of favourable outcomes to total outcomes. This basic principle underpins more complex probability theory and is most familiar from gambling, game theory, and simple decision problems.

    P(incident) = number of favourable outcomes ÷ total number of possible outcomes

    Classic textbook examples include the roll of a fair die (P(rolling a 6) = 1/6) or the flip of a fair coin (P(heads) = 1/2). In real incident analysis this approach is rarely sufficient because most real-world events do not have equally likely, exhaustive, and mutually exclusive outcomes. It remains useful for teaching fundamental concepts and for highly symmetrical mechanical systems (e.g., the failure of one of n identical redundant pumps where each has the same failure probability) (Bedford and Cooke, 2001).
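As a quick illustration, the ratio above can be computed exactly with Python's `fractions` module; the die and coin values are the textbook examples just mentioned:

```python
# Classical (a priori) probability: ratio of favourable to total outcomes
# in a finite sample space of equally likely outcomes.
from fractions import Fraction

def classical_probability(favourable: int, total: int) -> Fraction:
    """P(event) when all outcomes are equally likely."""
    if total <= 0 or not 0 <= favourable <= total:
        raise ValueError("need 0 <= favourable <= total and total > 0")
    return Fraction(favourable, total)

print(classical_probability(1, 6))  # fair die: P(rolling a 6) = 1/6
print(classical_probability(1, 2))  # fair coin: P(heads) = 1/2
```

Using exact fractions avoids floating-point noise for these small textbook cases.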

    2. Subjective (Bayesian) Probability

    When historical data are sparse, unrepresentative, or entirely absent, we must rely on expert judgment. Bayesian probability offers a robust framework for this situation: it treats probability not as a static measure but as a degree of belief that evolves and is updated as new evidence arrives.

    The governing principle is Bayes’ theorem, the foundation of Bayesian inference. It shows how to revise prior beliefs in light of new information, combining prior knowledge with current evidence so that estimates become sharper as data accumulate.

    Posterior probability ∝ likelihood × prior probability

    In odds form this becomes particularly intuitive for risk analysts:

    Posterior odds = prior odds × likelihood ratio

    Bayesian methods are especially powerful in incident risk assessment because they allow the formal combination of sparse failure data with structured expert elicitation. Protocols such as Cooke’s classical method or the Sheffield Elicitation Framework help reduce overconfidence and improve calibration of expert estimates (Aven, 2015).
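A minimal sketch of the odds-form update in Python. The prior (a 2% defect rate) and the inspection test's sensitivity and false-positive rate are hypothetical numbers chosen for illustration, not values from any dataset:

```python
# Posterior odds = prior odds × likelihood ratio (odds form of Bayes' theorem).

def update_odds(prior_prob: float, likelihood_ratio: float) -> float:
    """Return the posterior probability after one Bayesian update in odds form."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Prior: 2% of components are defective. An inspection flags defects with
# sensitivity 0.9 and false-positive rate 0.1, so the likelihood ratio is 9.
posterior = update_odds(0.02, 0.9 / 0.1)
print(round(posterior, 3))  # → 0.155
```

With a likelihood ratio of 9, a 2% prior rises to roughly a 15.5% posterior; the update is multiplicative in odds, which is why the odds form is convenient for chaining successive pieces of evidence.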

    3. Empirical (Frequentist) Probability

    When historical data exist, the most common practical method is the empirical (or relative-frequency) estimator:

    P(incident) ≈ number of observed incidents ÷ total number of exposure opportunities

    “Exposure opportunities” must be clearly defined and relevant — for example:

    • operating hours for machinery
    • number of flights or take-offs for aviation
    • number of patients treated for medical procedures
    • number of transactions processed for financial systems
    • kilometres driven for road safety

    This estimator is unbiased, and as the number of observations grows it converges to the true value. When the incident is rare, however, the numerator is small and the estimate becomes imprecise, with wide confidence intervals that may limit its practical use. Standard practice is therefore to report the point estimate together with a 95% confidence interval, typically computed with the Wilson score or Clopper–Pearson method for binomial proportions. For particularly rare events, a Poisson approximation is commonly employed (Vesely et al., 1981).

    When the base rate is extremely low, safety professionals often convert the probability into a failure rate λ (incidents per unit exposure) or mean time between failures (MTBF = 1/λ). For small probabilities, P(incident in time t) ≈ λ × t.
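The empirical estimator, a Wilson score interval, and the λ·t shortcut can be sketched together; the incident count and exposure hours below are made-up numbers for illustration:

```python
# Empirical incident probability with a 95% Wilson score interval,
# plus the rare-event shortcut P(incident in time t) ≈ λ·t.
import math

def wilson_interval(incidents: int, exposures: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = incidents / exposures
    denom = 1 + z**2 / exposures
    centre = (p + z**2 / (2 * exposures)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / exposures
                                   + z**2 / (4 * exposures**2))
    return centre - half, centre + half

# Hypothetical data: 3 incidents observed in 10,000 operating hours.
p_hat = 3 / 10_000
lo, hi = wilson_interval(3, 10_000)
print(f"point estimate {p_hat:.2e}, 95% CI ({lo:.2e}, {hi:.2e})")

lam = 3 / 10_000                      # failure rate, incidents per hour
print(f"P(incident in 100 h) ~ {lam * 100:.3f}")  # λ·t approximation
```

The Wilson interval stays sensible even when the count is small, which is exactly the rare-incident regime described above.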

    π. Exposure-Based Probability (Normalise by Opportunity)

    A raw count can mislead if activity levels change. Exposure-based measures normalise incident probability by the number of “chances” an incident had to occur (Rausand, 2011).

    • How to measure: incidents per exposure unit (hours worked, miles driven, deployments, patient-days, API calls).
    • Example: “2 incidents per 1,000 deployments.”

    Best for: environments where volume fluctuates.

    Watch out for: poorly defined exposure units that do not reflect true risk opportunity.

    4. Fault Tree Analysis (FTA) – Deductive Quantitative Modelling

    Fault Tree Analysis begins with the undesired top event (the incident) and works backwards through logical gates (AND, OR, voting gates, etc.) to identify all combinations of basic events that can cause it. Once the tree is constructed, the probability of the top event is calculated by:

    • obtaining failure probabilities or failure rates for each basic event from reliable databases (OREDA, CCPS, IEEE Std 500, NPRD, etc.)
    • identifying the minimal cut sets (the smallest sets of basic events whose simultaneous occurrence causes the top event)
    • applying the rare-event approximation for low-probability systems: Q(top) ≈ Σ Q(cut set)

    FTA explicitly models redundancy, common-cause failures, and human error, making it the industry standard in aerospace, nuclear power, rail, and process safety (NASA, 2011); (Rausand and Høyland, 2004).
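A sketch of the rare-event approximation over minimal cut sets. The basic-event probabilities and the cut sets themselves are illustrative placeholders, not values from OREDA or similar databases:

```python
# Rare-event approximation for a fault tree's top event:
# Q(top) ≈ Σ over minimal cut sets of Π basic-event probabilities.
from math import prod

basic_events = {"pump_A_fails": 1e-3, "pump_B_fails": 1e-3, "valve_stuck": 5e-4}

# Minimal cut sets: both redundant pumps fail together (AND gate),
# or the valve sticks on its own (single-event cut set).
minimal_cut_sets = [("pump_A_fails", "pump_B_fails"), ("valve_stuck",)]

q_top = sum(prod(basic_events[e] for e in cut) for cut in minimal_cut_sets)
print(f"Q(top) ~ {q_top:.2e}")  # 1e-6 + 5e-4
```

Note how the single-event cut set dominates: redundancy has pushed the two-pump contribution three orders of magnitude lower.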

    5. Event Tree Analysis (ETA) – Inductive Forward Modelling

    Event Tree Analysis starts from an initiating event (e.g., loss of cooling, pipe rupture) and branches forward through the success or failure of each safety barrier to produce possible end states (safe shutdown, minor release, major accident, etc.). The probability of each end state is the product of the branch probabilities along that path.

    ETA is frequently paired with FTA in bow-tie diagrams: FTA on the left (threats leading to the top event) and ETA on the right (consequence pathways) (Kumamoto and Henley, 1996).
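A toy event tree in Python: branch probabilities for two hypothetical barriers (alarm, shutdown) multiply along each path to give end-state frequencies. All numbers are assumed for illustration:

```python
# Event tree: the probability of each end state is the product of the
# branch probabilities along its path from the initiating event.

initiating_frequency = 0.1   # loss-of-cooling events per year (assumed)
p_alarm_works = 0.99
p_shutdown_works = 0.95

end_states = {
    "safe shutdown":  p_alarm_works * p_shutdown_works,
    "minor release":  p_alarm_works * (1 - p_shutdown_works),
    "major accident": (1 - p_alarm_works),
}

for state, p in end_states.items():
    print(f"{state}: {initiating_frequency * p:.4f} per year")
```

The conditional end-state probabilities sum to 1, a useful sanity check when building larger trees.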

    6. Bow-Tie Analysis

    Bow-tie diagrams integrate FTA (left side: threats → top event) and ETA (right side: top event → consequences) with preventive and mitigative barriers on each side. Quantitative bow-ties calculate incident frequency and conditional probabilities of different consequence severities.
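A minimal quantitative bow-tie sketch: assumed threat frequencies on the left side combine into a top-event frequency, and an assumed conditional split over consequence severities distributes it on the right:

```python
# Quantitative bow-tie sketch: left side (FTA-style) sums threat
# frequencies into a top-event frequency; right side (ETA-style)
# applies conditional consequence probabilities. Numbers are illustrative.

threat_frequencies = {"corrosion": 0.02, "overpressure": 0.01}  # per year
top_event_frequency = sum(threat_frequencies.values())

consequence_split = {"contained": 0.90, "minor release": 0.08,
                     "major release": 0.02}

for consequence, conditional_p in consequence_split.items():
    print(f"{consequence}: {top_event_frequency * conditional_p:.4f} per year")
```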

    7. Monte Carlo Simulation

    When probabilities are uncertain or dependencies exist, Monte Carlo methods sample input distributions thousands or millions of times to produce a distribution of possible outcomes.

    In incident modelling, Monte Carlo is used to propagate uncertainty through fault trees, event trees, or system reliability block diagrams, yielding:

    • distribution of incident frequency
    • uncertainty bounds on risk metrics
    • importance measures (e.g., Birnbaum, criticality) (Vose, 2008)

    8. Layer of Protection Analysis (LOPA)

    LOPA is a semi-quantitative method commonly used in process safety.

    It estimates the frequency of a consequence by multiplying:

    Initiating event frequency × product of the probability of failure on demand (PFD) of each independent protection layer (IPL)

    LOPA bridges qualitative HAZOP and full QRA (CCPS, 2008).
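The LOPA multiplication can be sketched directly; the initiating frequency and PFD values below are typical order-of-magnitude placeholders, not figures from any guideline table:

```python
# LOPA sketch: mitigated consequence frequency =
#   initiating event frequency × Π PFD of each independent protection layer.

initiating_frequency = 0.1   # e.g. control loop failure, per year (assumed)
ipl_pfds = {"relief valve": 0.01, "SIS interlock": 0.01,
            "operator response": 0.1}

mitigated_frequency = initiating_frequency
for pfd in ipl_pfds.values():
    mitigated_frequency *= pfd

print(f"mitigated frequency ~ {mitigated_frequency:.1e} per year")  # 1.0e-06
```

Each order-of-magnitude protection layer buys roughly a factor-of-100 or factor-of-10 reduction, which is why LOPA results are usually quoted to one significant figure.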

    9. Human Reliability Analysis (HRA)

    Human errors contribute to many incidents. Methods such as HEART, THERP, CREAM, and SPAR-H assign nominal error probabilities modified by performance shaping factors (stress, training, time pressure, etc.).
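A simplified multiplicative sketch in the spirit of HEART; the nominal error probability and shaping-factor multipliers are hypothetical, and the published HEART method scales each factor by an assessed proportion of its full effect rather than using this plain product:

```python
# Simplified HRA sketch: a nominal human error probability (HEP) scaled
# by performance shaping factors. All values are hypothetical.

nominal_hep = 0.003   # assumed nominal value for a routine task
shaping_factors = {"time pressure": 2.0, "inexperience": 1.5}

hep = nominal_hep
for multiplier in shaping_factors.values():
    hep *= multiplier

# Probabilities are capped at 1.0 by definition.
print(f"adjusted human error probability ~ {min(hep, 1.0):.4f}")
```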

    10. Predictive Models and Machine Learning

    Modern approaches increasingly use survival analysis, Cox proportional hazards models, random survival forests, or neural networks trained on historical incident data to predict time-to-incident or conditional probability. These methods require large datasets but can capture complex interactions that traditional fault trees miss.

    ∞. Confidence and Uncertainty Scoring (How Sure Are You?)

    Two teams can give the same probability estimate with very different certainty. Tracking confidence prevents false precision (Aven, 2016).

    • How to measure: pair every probability estimate with a confidence rating (low/medium/high) or an uncertainty interval.
    • Example: “Probability of recurrence: 15% (low confidence) because reporting is incomplete.”

    Best for: decision-making under uncertainty.

    Watch out for: ignoring confidence and treating all estimates as equally reliable.

    Putting it all together: a simple, practical approach

    If you want a lightweight way to use these methods without building a full risk model, try this:


    1. Start with historical and exposure-based rates (Methods 1 to π).
    2. Adjust for what has changed since the incident: controls, volume, environment (Methods 3 to 5).
    3. Check leading indicators to validate whether the probability is trending.
    4. Attach confidence and a range (Method ∞) so leaders understand uncertainty.

    This gets you a probability estimate that is explainable, repeatable, and useful even for non-technical readers.


    Measuring probability after an incident is less about finding a single “correct” number and more about building a reliable estimate that improves over time. The best teams combine data, structured judgement, and monitoring signals, then keep updating as they learn (Aven, 2016).

    Conclusion

    Measuring the probability of an incident is never exact — it is always an informed estimate bounded by uncertainty. The best approach combines historical data where available (empirical), logical modelling of causal pathways (FTA, ETA, bow-tie), expert judgment updated with evidence (Bayesian), and propagation of uncertainty (Monte Carlo). Validation against real outcomes remains essential.

    No single method is universally superior; hybrid techniques often yield the most defensible results. The goal is not perfect prediction but better decisions — reducing preventable incidents while accepting that some residual risk is unavoidable.

    References

    Aven, T. (2015) Risk Analysis. 2nd edn. Wiley. Available at: https://onlinelibrary.wiley.com/doi/book/10.1002/9781119057802 (Accessed: 23 February 2026).

    Aven, T. (2016) ‘Risk assessment and risk management: Review of recent advances on their foundation’, European Journal of Operational Research, 253(1), pp. 1–13.

    Bedford, T. and Cooke, R. (2001) Probabilistic Risk Analysis: Foundations and Methods. Cambridge University Press. Available at: https://www.cambridge.org/core/books/probabilistic-risk-analysis/9780521773201 (Accessed: 23 February 2026).

    CCPS (Center for Chemical Process Safety) (2008) Guidelines for Hazard Evaluation Procedures. 3rd edn. Wiley-AIChE. Available at: https://www.wiley.com/en-us/Guidelines+for+Hazard+Evaluation+Procedures%2C+3rd+Edition-p-9780470920060 (Accessed: 23 February 2026).

    Gelman, A., Carlin, J.B., Stern, H.S., Dunson, D.B., Vehtari, A. and Rubin, D.B. (2013) Bayesian Data Analysis. 3rd edn. Routledge.

    Kahneman, D. (2011) Thinking, Fast and Slow. Farrar, Straus and Giroux.

    Kroese, D.P., Taimre, T. and Botev, Z.I. (2014) Handbook of Monte Carlo Methods. Wiley.

    Kumamoto, H. and Henley, E.J. (1996) Probabilistic Risk Assessment and Management for Engineers and Scientists. 2nd edn. IEEE Press. Available at: https://ieeexplore.ieee.org/book/6267380 (Accessed: 23 February 2026).

    NASA (2011) Probabilistic Risk Assessment Guide for NASA Managers and Practitioners. NASA/SP-2011-3422. Available at: https://www.nasa.gov/sites/default/files/atoms/files/2011_prag_final_12-15-2011.pdf (Accessed: 23 February 2026).

    Rausand, M. and Høyland, A. (2004) System Reliability Theory: Models, Statistical Methods, and Applications. 2nd edn. Wiley. Available at: https://onlinelibrary.wiley.com/doi/book/10.1002/9780470316900 (Accessed: 23 February 2026).

    Rausand, M. (2011) Risk Assessment: Theory, Methods, and Applications. Wiley.

    Reason, J. (1997) Managing the Risks of Organizational Accidents. Ashgate.

    Vesely, W.E. et al. (1981) Fault Tree Handbook. U.S. Nuclear Regulatory Commission, NUREG-0492. Available at: https://www.nrc.gov/docs/ML1007/ML100780465.pdf (Accessed: 23 February 2026).

    Vose, D. (2008) Risk Analysis: A Quantitative Guide. 3rd edn. Wiley. Available at: https://www.wiley.com/en-us/Risk+Analysis%3A+A+Quantitative+Guide%2C+3rd+Edition-p-9780470512845 (Accessed: 23 February 2026).

    Weick, K.E. and Sutcliffe, K.M. (2015) Managing the Unexpected: Sustained Performance in a Complex World. 3rd edn. Wiley.

  • The Different Types of Hypothyroidism: An Informative Overview

    1. Primary Hypothyroidism

    Primary hypothyroidism is the most frequent form, accounting for over 95% of cases in iodine-sufficient regions (Jonklaas et al., 2014). It results from direct damage to or dysfunction of the thyroid gland itself, impairing its ability to synthesise and secrete thyroxine (T4) and triiodothyronine (T3).

    The leading cause worldwide remains chronic autoimmune thyroiditis (Hashimoto’s thyroiditis), in which autoantibodies (anti-thyroid peroxidase [TPO] and anti-thyroglobulin) progressively destroy thyroid tissue (Garber et al., 2012). Other important aetiologies include:

    • Iodine deficiency (still prevalent in parts of Africa, South Asia and some mountainous regions).
    • Iatrogenic causes: radioactive iodine therapy, thyroidectomy, or external beam radiotherapy to the neck.
    • Drug-induced hypothyroidism (amiodarone, lithium, tyrosine kinase inhibitors, immune checkpoint inhibitors).
    • Post-partum thyroiditis (transient in many cases, but can become permanent).
    • Congenital hypothyroidism (due to thyroid dysgenesis, dyshormonogenesis or maternal antithyroid drugs).

    Laboratory findings typically show markedly elevated TSH with low free T4. Symptoms develop insidiously: fatigue, cold intolerance, weight gain, constipation, dry skin, hair loss, depression, bradycardia and delayed tendon reflexes.

    Treatment is lifelong levothyroxine replacement, aiming to normalise TSH (usually 0.4–4.0 mIU/L, though individual targets vary) (Jonklaas et al., 2014). Regular monitoring every 6–12 months is recommended once stable.

    2. Central (Secondary and Tertiary) Hypothyroidism

    Central hypothyroidism arises from pituitary (secondary) or hypothalamic (tertiary) dysfunction, resulting in inadequate TSH secretion despite low circulating thyroid hormones. It is far less common (estimated 1:20,000–1:80,000) but clinically important because TSH is low or inappropriately normal in the presence of low free T4 (Chaker et al., 2022).

    Causes include:

    • Pituitary adenomas (most frequent).
    • Sheehan’s syndrome (post-partum pituitary necrosis).
    • Infiltrative diseases (sarcoidosis, haemochromatosis, Langerhans cell histiocytosis).
    • Traumatic brain injury.
    • Radiation to the sella turcica.
    • Congenital hypopituitarism.

    Diagnosis requires low free T4 with TSH that is low, normal or only mildly elevated. Free T3 may also be low. MRI of the pituitary is often indicated. Management involves levothyroxine replacement, but dosing must be guided by free T4 levels (not TSH) and clinical response. Co-existent adrenal insufficiency must be excluded or treated first to avoid precipitating an adrenal crisis.

    3. Subclinical Hypothyroidism

    Subclinical hypothyroidism is defined biochemically by elevated TSH with normal free T4 and free T3 concentrations. Prevalence increases with age, reaching 10–20% in people over 60 years. Most cases are mild (TSH 4.5–10 mIU/L) (Pearce et al., 2016).

    The decision to treat remains controversial and is guided by:

    • TSH level (>10 mIU/L is more likely to benefit from treatment).
    • Presence of symptoms.
    • Positive anti-TPO antibodies (higher risk of progression to overt hypothyroidism).
    • Cardiovascular risk factors.
    • Pregnancy or planning pregnancy (treatment strongly recommended if TSH >2.5–4.0 mIU/L depending on trimester) (Alexander et al., 2017).

    Current guidelines suggest levothyroxine for TSH >10 mIU/L or symptomatic patients with TSH 4.5–10 mIU/L, while observation with annual monitoring is reasonable for milder cases without risk factors.

    4. Transient and Drug-Induced Hypothyroidism

    Several situations cause temporary thyroid failure:

    • Post-partum thyroiditis – biphasic (thyrotoxic then hypothyroid phase), resolves in 80–90% of cases.
    • Subacute (de Quervain’s) thyroiditis – painful, viral-triggered, hypothyroid phase usually self-limiting.
    • Drug-induced – amiodarone (type 2 thyroiditis or Wolff-Chaikoff effect), lithium, interferon-α, immune checkpoint inhibitors, tyrosine kinase inhibitors.

    Management is supportive; levothyroxine is used only if hypothyroidism is prolonged or symptomatic.

    5. Congenital Hypothyroidism

    Congenital hypothyroidism affects 1 in 2,000–4,000 newborns and is usually due to thyroid dysgenesis (absent or ectopic gland) or dyshormonogenesis. Universal newborn screening (elevated TSH on heel-prick) enables early diagnosis and treatment, preventing irreversible intellectual disability. Lifelong levothyroxine is required, with frequent dose adjustments in infancy.

    Clinical and Practical Considerations

    Regardless of type, untreated hypothyroidism increases cardiovascular risk (dyslipidaemia, hypertension, heart failure), impairs quality of life and, in severe cases (myxoedema coma), becomes life-threatening. Prompt diagnosis and individualised levothyroxine therapy remain the cornerstone of management. Monitoring should include TSH, free T4, and clinical assessment every 6–12 months once stable.

    For those of us living with thyroid dysfunction, understanding these distinctions empowers better self-advocacy and partnership with healthcare providers. Knowledge truly is a form of healing.

    References

    Alexander, E. K. et al. (2017) 2017 Guidelines of the American Thyroid Association for the diagnosis and management of thyroid disease during pregnancy and the postpartum. Thyroid, 27(3), pp. 315–389.

    Chaker, L. et al. (2022) Hypothyroidism. The Lancet, 399(10333), pp. 1536–1552.

    Garber, J. R. et al. (2012) Clinical practice guidelines for hypothyroidism in adults: cosponsored by the American Association of Clinical Endocrinologists and the American Thyroid Association. Thyroid, 22(12), pp. 1200–1235.

    Jonklaas, J. et al. (2014) Guidelines for the treatment of hypothyroidism: prepared by the American Thyroid Association Task Force on Thyroid Hormone Replacement. Thyroid, 24(12), pp. 1670–1751.

    Pearce, S. H. S. et al. (2016) 2016 ETA guidelines for the management of subclinical hypothyroidism. European Thyroid Journal, 5(4), pp. 215–228.

  • Why “Vague Sustainability” is Starting to Look Really Suspicious

    On top of that, though, some got quieter because they realised they didn’t actually have much to say. Some got quieter because, sure, it’s easier to stop talking than it is to keep improving. There are plenty of brands like this; luxury fashion brands such as Chanel are especially guilty of it. But with all of that said, there’s a difference between being careful and being vague. And you had better believe that customers can tell the difference.

    And honestly, being vague is starting to feel like a red flag. Well, it’s been a red flag, but it’s even bigger now. 

    Being Quiet isn’t Automatically “Humble” 

    Yeah, it’s as plain and as simple as that, honestly. But this is where it gets a little spicy, because some brands act like silence is a noble move now. Like, “oh, it’s better not to talk about it,” and sure, sometimes that’s true if a business is still figuring things out and doesn’t want to overpromise.

    But if a business is selling itself as sustainable and there are no details anywhere, that’s not humility; that’s just confusing. Think about it: customers don’t want a scavenger hunt. They don’t want to dig through five pages, a PDF, and a vague Instagram caption just to find out if a company’s claims are real. And some companies don’t even provide a scavenger hunt; they’ll say they’re active, but there’s literally no proof of any of it.

It makes total sense that customers have grown more sceptical: too many businesses used sustainability as a marketing costume. So now, when a company is vague, people don't assume it's being responsible; they assume it's hiding something. That's the reality.

    It’s Better to be Transparent than Perfectly Sustainable

You should still do what you can to be sustainable, but don't think it has to be perfect. A lot of small businesses freeze up because they believe they need perfection before saying anything, as if a business that can't claim zero waste or carbon neutrality shouldn't talk about sustainability at all.

But is any of that true? Not at all. It also sets up a weird dynamic where only huge corporations with big budgets get to "talk sustainability", while smaller businesses that are actually trying stay silent. Transparency can be simple: here's what's being done now, here's what's still being improved, and here's what customers can expect.

    That kind of honesty is trustworthy because it’s normal. It sounds like a human business, not a marketing machine.

    It Wouldn’t Hurt to Audit Competitors

    And what exactly would be the reason to do this, though? Just think about it; if competitors are vague, that’s an opportunity. If competitors are making big claims without proof, that’s an opportunity. If competitors have confusing policies or unclear pricing, that’s an opportunity too. Some businesses even use industry tools to see how others communicate offers and policies, especially in operational niches. 

    Like, a company in the waste space might look at a waste hauler competitor app to understand how other operators present service options and customer communication, then use that insight to create a clearer, more transparent experience. It just helps to spot the gaps they have, so you can fill the gaps for your business. 

    Customers aren’t Just Buying a Product 

Of course, this is what a lot of businesses forget: sustainability messaging isn't only about the planet. It's also about competence. When a company clearly explains what it does and why, it feels organised. It feels accountable. It feels like it has standards.

And that matters, because customers are constantly making quick trust decisions. Is this business legit? Is it consistent? Will it follow through? Will it surprise someone with hidden fees, messy policies, or vague claims? Lots of questions, but transparency is supposed to answer all of them; everything should be clear right from the get-go. Again, no scavenger hunts.

    It’s Easier to Compete without Racing to the Bottom

Competing was already mentioned in terms of audits and finding gaps, but that's not the only thing to keep in mind. Pricing competition is exhausting, as you probably already know. Competing on "cheapest" usually turns into lower margins, rushed work, and customers who treat the business as interchangeable. Clearly, that's not a sustainable business model, and yes, that word is doing double duty there.

    But go ahead and think about this: transparency gives a business another lane to compete in. It gives a business a way to justify pricing, explain value, and build loyalty with customers who care about responsible practices. And even customers who don’t care deeply about sustainability still like the idea of less waste, fewer problems, and a business that’s honest.

Again, as was mentioned, it helps when competitors are vague. If other businesses are hard to compare because they hide details, a transparent business stands out and feels easier to choose, because customers can see what they're paying for. They don't like scavenger hunts, and it's pretty easy to fill the gaps your competitors leave.

  • SEO Best Practices: Maximise Your Website’s Performance

    SEO Best Practices: Maximise Your Website’s Performance


    SEO: Claiming Your Digital Real Estate

    Every serious website owner dreams of the same thing: when someone types their name or business into a search engine, their site appears on page one—ideally in position one. Achieving this is neither impossible nor effortless; it requires consistent, ethical practices that align with how modern search engines evaluate quality.

Search engines assign “rank” based on a combination of signals: GDPR compliance (demonstrating trustworthiness and data protection), well-structured SEO metadata (title tags, meta descriptions, schema markup), lightning-fast loading performance (especially Core Web Vitals), mobile-friendliness, and, most importantly, the genuine value and originality of the content (Google, 2025).
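The metadata signals above can be checked mechanically. As an illustrative sketch (not any particular tool's implementation), a short Python script using only the standard library can flag missing or oversized title tags and meta descriptions; the ~60 and ~160 character limits used here are common rules of thumb, not hard search-engine rules:

```python
from html.parser import HTMLParser


class MetaAudit(HTMLParser):
    """Collects the <title> and meta description from raw HTML."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def audit(html: str) -> list[str]:
    """Return warnings for missing or oversized SEO metadata."""
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing <title>")
    elif len(parser.title) > 60:
        issues.append("<title> longer than ~60 characters")
    if not parser.description:
        issues.append("missing meta description")
    elif len(parser.description) > 160:
        issues.append("meta description longer than ~160 characters")
    return issues
```

Running `audit()` over a page with a title but no description would, for example, report only the missing meta description.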

    Security: Safeguarding Your Digital Sanctuary

A beautiful website is worthless if it can be compromised. Web trolls, spammers, malicious bots, hackers, and zero-day exploits are daily realities, so it is important to keep your website safe from these and other cyber-insecurities. Equally important is the ability to recover gracefully from mistakes or malicious changes by restoring the website to an earlier point, and to scan for potential vulnerabilities. For someone like me, who works alone and cannot afford enterprise-level security teams, having this level of protection is priceless.

    Social Presence: Amplifying Reach Without Losing Control

Being present on social media builds prestige and trust. A strong social presence still correlates with indirect SEO benefits, particularly through brand searches and referral traffic. There is much to be gained from sharing new posts to multiple platforms, such as Facebook, LinkedIn, and Tumblr, ensuring consistent visibility without manual reposting.

    AI: Boosting Productivity Without Sacrificing Authenticity

Used well, AI can take over drafting, summarising, and formatting chores. For me, this has meant publishing more frequently without burnout. The AI never replaces my lived experience or forensic insights; it simply removes friction, allowing me to focus on what matters most: authentic connection with my readers.

    Analytics: Data-Driven Decisions

One way of assessing how your site is doing is by checking its statistics. These should show the number of daily, weekly, and monthly visitors, as well as which content is trending, the geolocation of your readers, and the websites referring users to your site. Simply put, you cannot improve what you do not measure. These insights guide my content strategy: when I noticed a spike in mobile traffic from the United States searching for “lived experience narcissism”, I created a dedicated series, and traffic and engagement soared. Real-time data turns guesswork into precision.
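As a toy illustration of the questions such statistics answer (the log entries below are invented, not real traffic data), a few lines of Python can surface trending posts, reader geography, and device share from a raw page-view log:

```python
from collections import Counter

# Hypothetical page-view log: one (path, country, device) tuple per visit.
hits = [
    ("/narcissism-series", "US", "mobile"),
    ("/narcissism-series", "US", "mobile"),
    ("/about", "GB", "desktop"),
    ("/narcissism-series", "US", "desktop"),
]

# Trending content: which paths get the most visits.
top_posts = Counter(path for path, _, _ in hits)

# Geolocation of readers, aggregated by country code.
by_country = Counter(country for _, country, _ in hits)

# Share of visits coming from mobile devices.
mobile_share = sum(1 for *_, device in hits if device == "mobile") / len(hits)

print(top_posts.most_common(1))   # [('/narcissism-series', 3)]
print(by_country.most_common(1))  # [('US', 3)]
print(mobile_share)               # 0.5
```

Real dashboards do exactly this kind of aggregation, just over much larger logs and with time windows.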

[Figure: bar graph of website visitor statistics from 22 December to 12 January, with a notable peak on 9 January.]

    CRM: Building Meaningful Relationships

    If you offer consultations, digital products, or community engagement (as I do), basic Customer Relationship Management functionality becomes essential. For someone running a solo advocacy and consulting site, these tools keep interactions personal yet manageable—vital when executive function challenges make organisation effortful.

    🫂 Disclosure: This post contains affiliate links, and is sponsored by Automattic, Inc. Any purchases you make will help to maintain this website through earned commissions.

    Introducing Jetpack: A WordPress.com Plugin

Jetpack is a native WordPress.com plugin by Automattic, Inc. that bundles all of the components mentioned above. It provides real-time backups, stats, security scans, artificial intelligence, and CRM, and it allows you to distribute your blog posts across different social media websites. Jetpack also helps you monetise your project and improves performance.

The plugin contributes directly to several SEO pillars. Among other features, it optimises image delivery through its Content Delivery Network (CDN) and measurably improves Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS), two critical Core Web Vitals metrics that directly influence rankings (Google PageSpeed Insights, 2025). What’s more, Jetpack’s AI Assistant handles many repetitive tasks intelligently, proposing SEO-friendly headings while leaving the final voice and tone entirely in your hands (Automattic, 2025b).

In my own experience, my website’s Core Web Vitals improved after implementing Jetpack Boost’s performance enhancements. That kind of progress is not accidental; it is engineered through thoughtful tooling.
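For reference, Google publishes thresholds for these metrics (good: LCP ≤ 2.5 s, CLS ≤ 0.1, INP ≤ 200 ms; poor: above 4.0 s, 0.25, and 500 ms respectively). The helper below is an illustration of how a measurement maps onto those bands, not part of Jetpack or PageSpeed Insights:

```python
# Google's published "good" and "poor" thresholds for Core Web Vitals:
# LCP in seconds, CLS unitless, INP in milliseconds.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "CLS": (0.1, 0.25),
    "INP": (200, 500),
}


def rate(metric: str, value: float) -> str:
    """Classify a field measurement into Google's three bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"


print(rate("LCP", 2.1))  # good
print(rate("CLS", 0.3))  # poor
```

An LCP improvement from, say, 3.2 s to 2.1 s therefore moves a page from "needs improvement" into the "good" band.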

    💡 If you haven’t got a website yet, make sure to read my post:
    Decentralisation: Why You Should Have a Website by 2030


    Furthermore, Jetpack Security provides real-time backups (with one-click restore), automated malware scanning, brute-force attack blocking, downtime monitoring, and activity logging. These features have saved me more than once when experimental theme edits went wrong—I simply rolled back to a previous version in seconds. The plugin also includes a Web Application Firewall (WAF) that blocks known malicious IPs before they reach your server, significantly reducing server load and potential vulnerabilities (Automattic, 2025a).

    Also, I have found that scheduling shares via Jetpack Social keeps my social channels active even on low-energy days, helping maintain momentum while I focus on core content creation. The result? Steady referral traffic and higher branded search volume—classic signs of growing domain authority. I can monitor this through Jetpack Stats, which provide clean, privacy-respecting analytics: daily/weekly/monthly visitors, top-performing posts, geographic distribution of readers, referral sources, search queries driving traffic, and device breakdowns—all visible in a simple dashboard or via the mobile app.

    Finally, Jetpack includes GDPR consent compliance, contact forms with spam protection, and newsletter subscription tools. It also supports multiple revenue paths: WordAds integration for display advertising, direct payment buttons via Stripe or PayPal, and full WooCommerce compatibility for digital shops.


    Conclusion

In a world of algorithm dependency and platform fragility, an integrated system like this empowers website owners to reclaim control. It combines SEO strength, ironclad security, effortless social distribution, intelligent AI assistance, actionable analytics, relationship tools, and monetisation options, all within one cohesive ecosystem.

    For me, Jetpack is not just a plugin; it is the backbone of Betshy.com’s longevity. It allows me to focus on creating meaningful content—sharing forensic insights, lived experience, and hope—while the technical heavy lifting is handled reliably. Whether you are a blogger, advocate, consultant, or dreamer building your empire, WordPress.com gives you the tools to grow sustainably, securely, and authentically.

    References

    Automattic (2025a) Jetpack Security: Protect your site. Available at: https://jetpack.com/features/security/ (Accessed: 12 January 2026).

    Automattic (2025b) Jetpack AI Assistant. Available at: https://jetpack.com/ai/ (Accessed: 12 January 2026).

    Google (2025) Core Web Vitals. Available at: https://developers.google.com/search/docs/appearance/core-web-vitals (Accessed: 12 January 2026).

    Google PageSpeed Insights (2025) PageSpeed Insights documentation. Available at: https://developers.google.com/speed/docs/insights/v5/about (Accessed: 12 January 2026).