Yes Adrian, some of us have flagged that the use of trackers should be considered, and there are symptom-tracking apps like Visible (which already incorporates the Norwegian FUNCAP questionnaire as a monthly review).
Unfortunately these comments have fallen on deaf ears, and 20th-century models are presented as the gold standard.
My point, though, goes beyond the use of trackers: AI enables very different interfaces and ways to interact, so it's not just, say, an accelerometer or a form on a phone. Someone could speak or type a comment into their phone about what is on their mind regarding their symptoms and health, and this can now be transcribed, interpreted, classified, and responded to with advice, using LLMs (transformer networks).
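To make that concrete, here is a minimal sketch of such a pipeline in Python. Everything here is hypothetical and simplified: in a real system transcribe() would call a speech-to-text model (e.g. Whisper) and classify_symptoms() would be a prompt to an LLM; a keyword stand-in is used instead so the sketch stays self-contained.

```python
# Sketch of a voice-note symptom pipeline: speak/type -> transcribe ->
# classify -> structured entry. All names are illustrative placeholders.

SYMPTOM_KEYWORDS = {
    "fatigue": ["tired", "exhausted", "fatigue"],
    "pain": ["ache", "pain", "sore"],
    "cognition": ["brain fog", "concentrate", "memory"],
}

def transcribe(audio_note: str) -> str:
    # Placeholder: a real system would run speech-to-text here.
    return audio_note

def classify_symptoms(text: str) -> list[str]:
    # Placeholder: a real system would ask an LLM to interpret and
    # classify the free-text note; simple keyword matching stands in.
    lowered = text.lower()
    return [label for label, words in SYMPTOM_KEYWORDS.items()
            if any(word in lowered for word in words)]

def daily_entry(audio_note: str) -> dict:
    # Turn one spoken/typed comment into a structured tracker entry.
    text = transcribe(audio_note)
    return {"note": text, "symptoms": classify_symptoms(text)}

entry = daily_entry("Felt exhausted all day and couldn't concentrate after lunch.")
```

The point of the sketch is the shape of the interaction, not the classifier: the patient just talks, and structure is recovered afterwards, rather than the patient filling in a fixed form.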
I'm starting to think that much more radical recording and interaction approaches can now be enabled.