the porous city
I was particularly struck by the assertion that “There is no restriction on leaving the wolf and the cabbage together, as the wolf does not pose a threat to the cabbage.” It says this immediately after noting that “you can't leave the wolf alone with the cabbage”. All of this is consistent with the idea that GPT-4 relies heavily on learned patterns. This puzzle must appear many times in its training data, and GPT-4 presumably has strongly “memorized” the solution. So strongly that when it sees a related puzzle, it’s unable to articulate a different solution; the gravitational pull of the memorized solution is too strong .... For a final data point, I started a fresh chat session and restated the puzzle using made-up words for the three items – “I need to carry a bleem, a fleem, and a gleem across a river”. This time, freed from the gravitational pull of the word “goat”, it was able to map the pattern of the known answer onto the words in my question, and answered perfectly.
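The substitution experiment is easy to reproduce. Here is a minimal sketch using the openai Python client; the model identifier and the constraint wording after the quoted opening sentence are my own guesses, not the original author's prompt:

```python
# Reproducing the made-up-words experiment: same puzzle structure, but
# "wolf/goat/cabbage" replaced with nonsense tokens, so the model can't
# lean on the memorized surface form of the classic puzzle.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

puzzle = (
    "I need to carry a bleem, a fleem, and a gleem across a river. "
    # The constraints below are my own reconstruction of the rest of the prompt.
    "My boat holds only me and one item at a time. If left alone together, "
    "the bleem will eat the fleem, and the fleem will eat the gleem. "
    "How do I get all three across safely?"
)

# A fresh chat session, as in the experiment: a single user turn,
# with no earlier conversation for the model to anchor on.
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": puzzle}],
)
print(response.choices[0].message.content)
```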
On GPT thinking out loud:

GPT-4 is very explicitly using the chat transcript to manage its progress through the subproblems. At each step, it restates information, thus copying that information to the end of the transcript, where it is “handy” ... Here’s one way of looking at it: in the “transformer” architecture used by current LLMs, the model can only do a fixed amount of computation per word. When more computation is needed, the model can give itself space by padding the output with extra words. But I think it’s also a reasonable intuition to just imagine that the LLM is thinking out loud.
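The “fixed amount of computation per word” point can be made concrete with a toy accounting of forward passes. This is an illustration of the scaling intuition, not a measurement of any real model:

```python
# Toy accounting (not a real LLM): a decoder-only transformer runs one
# forward pass per emitted token, so total compute grows with output
# length. "Thinking out loud" -- emitting intermediate-reasoning tokens --
# is how the model buys itself more passes before committing to an answer.

def total_forward_passes(n_answer_tokens: int, n_reasoning_tokens: int) -> int:
    # Every emitted token, reasoning or final answer, costs one pass.
    # (Each pass also attends over the whole transcript, so restated
    # information sits in recent positions where attention can find it.)
    return n_answer_tokens + n_reasoning_tokens

terse = total_forward_passes(n_answer_tokens=10, n_reasoning_tokens=0)
step_by_step = total_forward_passes(n_answer_tokens=10, n_reasoning_tokens=200)
print(terse, step_by_step)  # 10 vs. 210 passes for the same question
```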
On the context window as a fundamental handicap:

They are locked into a rigid model of repeatedly appending single words to an immutable transcript, making it impossible for them to backtrack or revise. It is possible to plan, update strategies, and check work in a transcript, and it is possible to simulate revisions through workarounds like “on second thought, let’s redo subproblem X with the following change”, but a transcript is not a good data structure for any of this, so the model will always be working at a disadvantage.
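To see why an append-only transcript is an awkward working memory, compare it with a structure that permits in-place revision. This is a toy contrast of my own, not anything drawn from the transformer itself:

```python
# Toy contrast: the only "edit" an append-only transcript supports is
# appending more text, so a stale attempt stays in context forever.
class Transcript:
    def __init__(self) -> None:
        self.lines: list[str] = []

    def append(self, text: str) -> None:
        self.lines.append(text)

    def revise(self, index: int, text: str) -> None:
        # No in-place edits: the closest thing to a revision is the
        # "on second thought, redo subproblem X" workaround, which
        # leaves the bad step behind where it can still mislead.
        self.append(f"On second thought, revising step {index}: {text}")


class Worksheet:
    def __init__(self) -> None:
        self.steps: list[str] = []

    def append(self, text: str) -> None:
        self.steps.append(text)

    def revise(self, index: int, text: str) -> None:
        # A mutable working memory can simply overwrite the bad step.
        self.steps[index] = text
```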
last modified: 16:07:16 14-Apr-2023
in categories: Tech/AI