# Mayo Clinic AI: Imaging, Genomics, Memory

**Source:** [https://www.youtube.com/watch?v=ElQ7deX014I](https://www.youtube.com/watch?v=ElQ7deX014I)
**Duration:** 00:05:17

## Key Points

- Mayo Clinic announced two AI initiatives: an automated radiology workflow that generates reports, assists with tube/line placement, and detects changes in chest X‑rays, moving from anecdotal success to a production system.
- In partnership with Azure, Mayo is creating a reference human‑genome dataset by combining its exome data with large‑scale genome data, aiming to use AI‑driven models to accelerate personalized‑medicine analysis.
- Chinese AI developer MiniMax introduced its MiniMax model with a 4‑million‑token context window and claimed “perfect recall,” marking a new generation of LLMs capable of handling extremely long inputs.
- OpenAI’s ChatGPT beta now offers an advanced memory feature, further enhancing the model’s ability to retain and recall information across interactions.
- These developments highlight a broader trend toward extending context windows and improving memory in large language models, enabling more sophisticated and high‑stakes applications such as medical imaging and genomics.

## Sections

- [00:00:00](https://www.youtube.com/watch?v=ElQ7deX014I&t=0s) **Untitled Section**

## Full Transcript
AI news today: we've got a bunch of fun ones for you. It just never gets old. I like to talk about actual applications of AI, and I think it's really interesting to see them in a medical setting, because those are really high stakes; you have to get them right.

Mayo Clinic has published two different use cases for AI recently that I wanted to call out. One is developing an AI model for automated radiology workflows, which I'm not super surprised by, because there's been a lot of anecdotal reporting of how good large language models are at reading X-ray images. What they've done is use it to build report generation into the X-ray workflow; they're using it for tube and line placement evaluation, which is super interesting; and they're looking at change detection in chest X-rays, which fits right in with the anecdotes. But there's a difference here, right? It's one thing to have some bro on X say, "I got this X-ray looked at; look at what Grok did, look at what ChatGPT did." It's another to have Mayo Clinic say this is good enough that we're actually going to build it in, and so I think that's a
moment.

The other thing they're working on, with Azure, is combining human genome data with Mayo's exome data sets. They're basically looking at how they can build a reference, sort of "perfect," human genome. I know that sounds like the start of a sci-fi movie, so we're just going to pass over that part. But they're looking to build essentially a reference data set around the human genome with Azure that they can then use, when they look at genome variants, to build personalized medicine. The details are somewhat sketchy on that part; I could not tell you exactly how they are using AI to scale personalization, except that in general the compute for personalization has been really difficult to do, and AI is often able to figure out personalized relationships faster, because a lot of what's going on under the hood is transformers running comparisons between tokens and building token relationships. That might be what's happening, but it's speculation.
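That point about transformers running pairwise comparisons between tokens can be illustrated with a minimal sketch of scaled dot-product self-attention. This is a generic toy example in NumPy, not anything from Mayo's or Azure's actual pipeline:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy scaled dot-product attention: every query token is
    scored against every key token, producing a matrix of
    pairwise token relationships (the attention weights)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # all-pairs token comparisons
    # Row-wise softmax turns raw scores into relationship weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights  # weighted mix of values, plus weights

# Three toy "tokens", each a 4-dimensional embedding.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)  # self-attention
print(w.shape)  # (3, 3): one relationship score per token pair
```

The point of the sketch is just that the weight matrix is quadratic in the number of tokens: the model explicitly scores every token against every other token, which is the kind of all-pairs relationship computation the speculation above is about.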
I'll be curious to learn more about what they're doing with their genomic foundation model.

Number three: MiniMax, a Chinese model maker, introduced their MiniMax model. It has a 4-million-token context window, so it's quite large, and it claims perfect recall. This is now the second long-context-window, perfect-recall feature that I have seen in the last two days, and there's a third one that I'm going to mention now: ChatGPT, in beta for some users, is now rolling out advanced memory. How do I want to put it? It's not clear whether advanced memory means it remembers better or remembers more, but it's happening. So when I look at these different bullet points, the MiniMax model that came out, the work that's been done on Titans at Google, and now this release by ChatGPT, I think we're seeing the dotted lines toward one of the themes for 2025, which is solving the memory problem. I would expect a lot more releases along those lines in the next couple of
months.

Number four: you probably know this, but I think it's worth calling out because it's another actual launch at scale. Reddit has gotten LLM search to be good enough that they feel good about launching Reddit Answers, an AI-powered search tool that provides curated, conversational insights from Reddit's discussions. What's interesting to me is that Reddit chose to take their time with this one. They could have been like Google, which rushed its AI summaries to market and got panned for it. Reddit took their time, and it makes me wonder if Reddit Answers is actually going to be a higher-quality experience for users than Google's, because they took the time on quality. So I'd be curious for your
thoughts there.

Last but not least, we have what I would call the rumor mill. There are three things rumored to be shipped by OpenAI by January 30th. One is the o3 model, which they've announced but not released. One is the ever-elusive Project Orion; no one really knows what that is, and some people think it's GPT-5, which would be a new pre-trained class of language model bigger than GPT-4, but that's speculation at this point. And the third thing people think they're going to release is something further along on agents: not just scheduled tasks, but an entire class of agents that we think they will call Operators. It's speculation, we're not sure of the date, it's rumor, and that's why I stuck it at the end, but it's worth keeping an eye on as we get close to the end of January.

All right, that's what we've got for AI news. Let me know what you think. Cheers.