
BBC2, 9pm 1 Nov: Diagnosis on Demand? The Computer Will See You Now

Discussion in 'Other health news and research' started by Sasha, Nov 1, 2018.

  1. Sasha

    Sasha Senior Member (Voting Rights)

    Messages:
    3,780
    Location:
    UK
    Sounds very interesting:

    Read the whole thing at:

    https://www.bbc.co.uk/programmes/b0bqjq0q

    I wonder if part of the reason that PWME have difficulty getting diagnosed, let alone treated, is that doctors don't have the time or the tests to deal with us. Perhaps AI plus better testing tech will be part of the answer in the future...
     
    andypants, Inara and Webdog like this.
  2. wdb

    wdb Senior Member (Voting Rights)

    Messages:
    320
    Location:
    UK
    Exciting stuff, but I have to say I'd be a little concerned over who owns the AI and how much control they will have over it. Will the AI be trained to best serve the needs of the patients, or to best serve the balance sheet of the medical industry?
     
    Samuel, andypants, Sly Saint and 9 others like this.
  3. Arnie Pye

    Arnie Pye Senior Member (Voting Rights)

    Messages:
    6,095
    Location:
    UK
    Any half-way competent programmer could programme for both situations, and after all testing was done they could just change a setting so that everything was based on the bottom line and profits.
     
    andypants likes this.
  4. WillowJ

    WillowJ Senior Member (Voting Rights)

    Messages:
    676
    In the future, hopefully that will help, provided the right people are directing things, as has been said.

    At the moment, it’s being found that AI magnifies human bias (even unintentional bias), for example via the data sets used for training.


    https://www.technologyreview.com/s/608986/forget-killer-robotsbias-is-the-real-ai-danger/
    https://www.ibm.com/blogs/policy/bias-in-ai/
    https://www.theguardian.com/technol...bit-racist-and-sexist-biases-research-reveals
    But it looks like AI folk are trying to fix this.

    https://www.pbs.org/wgbh/nova/article/ai-bias/
    It doesn’t look, from this article or anything else I’ve read, like they’re doing much to include marginalized populations, however (chronically ill, disabled, or otherwise).
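
    The bias-amplification point above can be sketched with a toy example (all data and group names here are invented for illustration): a naive model trained on biased historical records simply learns the bias and automates it.

    ```python
    # Toy sketch (hypothetical data): if historical doctors dismissed one
    # group's symptoms more often, a naive classifier trained on those
    # records learns to do the same.
    from collections import Counter

    # Hypothetical training records: (patient_group, outcome_given).
    # Group "B" was historically under-diagnosed for the same symptoms.
    records = [
        ("A", "treated"), ("A", "treated"), ("A", "treated"), ("A", "dismissed"),
        ("B", "dismissed"), ("B", "dismissed"), ("B", "dismissed"), ("B", "treated"),
    ]

    def train_majority(records):
        """Learn the most common past outcome for each group."""
        by_group = {}
        for group, outcome in records:
            by_group.setdefault(group, Counter())[outcome] += 1
        return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

    model = train_majority(records)
    print(model["A"])  # treated
    print(model["B"])  # dismissed - the historical bias, now automated
    ```

    Nothing in the model "decided" to discriminate; it just reproduced the pattern in its training set, which is the mechanism the articles above describe.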
     
    andypants, Joel and Inara like this.
  5. Alvin

    Alvin Senior Member (Voting Rights)

    Messages:
    3,309
    AI is not the panacea most hope it will be.
    It's a computer designed to compare A to B and spit out an answer. It's only as good as the programmer who designed it, and it has no common sense or intelligence. That said, I know many doctors who have no sense either, but that doesn't make a computer a diagnosing deity.
     
    andypants and Inara like this.
  6. Sly Saint

    Sly Saint Senior Member (Voting Rights)

    Messages:
    9,584
    Location:
    UK
  7. Joel

    Joel Senior Member (Voting Rights)

    Messages:
    941
    Location:
    UK
    As a patient, understanding how the AI functions (how it processes its inputs to arrive at its outputs) will be important to avoid mistreatment, as it'll be programmed to assess whether something is in the patient's head despite the claims of the patient. All the failures of modern medicine will be programmed into it, so it's really not that dissimilar to the current situation with idiot doctors. It might actually be worse, though, as there will be fewer tell-tale signs, such as facial expressions and mannerisms, for patients to judge where the AI is in its processing of the information they give it.
     
    wdb, Arnie Pye and andypants like this.