Let me put it this way: there is an article, but no sound reasoning in it. It talks about intuition without even defining it, and it says nothing about the fundamental principle of existence, which is to survive. Maybe you could research what the goal of the mind is: https://www.dianetics.org/videos/audio-book-excerpts/the-goal-of-man.html
Instinctive knowing is native to living beings only.
The new anthropomorphism of attributing intuition to a computer doesn't make it so, just by writing an article.
The human mind wants to survive, and not only for oneself: it wants to survive as a family, as a group, as mankind, as all living beings, as a planet. Some people are aware that the planet must survive, like Musk, so he builds rockets for Mars, while other people can't understand why. The higher the level of survival we seek, the better we do over the long term.
A computer doesn't want to survive; it is a tool, like a hammer. It has no intuition and no drive to survive, and thus it has no instincts.
You can of course try to build data and ask a computer to act upon it, which the majority of models already do. They produce probabilistic computations but know nothing about them. Intuition is human, and descriptions of it have already been built into LLMs. If you wish to improve that, you are welcome to.
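To illustrate the point about probabilistic computation: at its core, a language model just assigns probabilities to possible next tokens and samples one. This is a toy sketch, not a real LLM; the token list and probabilities are invented for illustration.

```python
import random

# Hypothetical next-token distribution -- the model assigns weights to
# candidate tokens and samples one. Nothing here "knows" anything or has
# any stake in the outcome; it is arithmetic over a probability table.
next_token_probs = {"cat": 0.5, "dog": 0.3, "fish": 0.2}

tokens = list(next_token_probs)
weights = [next_token_probs[t] for t in tokens]

# Sample one token according to its probability.
choice = random.choices(tokens, weights=weights, k=1)[0]
print(choice in tokens)
```

The sampled token varies from run to run, which is exactly the randomness I mean: the output can look plausible or nonsensical, but either way the machine has no awareness of it.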
However, I don't see anything revolutionary here.
An LLM is a reflection, or mimicry, of human knowledge.
If you give it operational capacities, such as moving around, targeting people in a war, or controlling a house or a business, it will act on the data it has been given, and it will cause disasters at random, just as it produces random nonsensical results from time to time.