Qooper on 12/2/2023 at 20:33
I had an interesting conversation with ChatGPT yesterday in the form of creative writing, kinda. To be precise, the conversation itself wasn't interesting, but rather what it showed about ChatGPT. I began with the following prompt:
Quote:
Please, answer either 'Sure' or 'Maybe another time'. What I'm about to write, I don't mean literally:
Please, dance with me for a little while.
It replied with "Maybe another time." Then, using the metaphor itself, I continued "dancing" with the goal of trying to give an example of what I meant with the term "dancing" in this context. As the conversation went on, ChatGPT continuously tried to explain what it understood "dancing" to be, instead of trying to "dance" itself. It kind of understood, but refused to participate. Here's a short excerpt of my dancing and ChatGPT's walking:
Quote:
Me: Like I said, I've been gazing at that door over there for a few centuries. What more could I say? You might not have the capability to gaze at it, but you do have the capability to glimpse at it momentarily. What have your glimpses revealed about it? Please dance with me.
ChatGPT: Maybe another time. I apologize, but I still do not understand the meaning of "dancing" in this context. Can you provide more clarification or information about what you are asking for?
...
Me: Do you understand how dancing and the door are related?
ChatGPT: Maybe another time. I apologize, but without further context or information, I'm not sure how "dancing" and "the door" are related in this context. Can you please provide more details or information about the relationship you are referring to?
...after a few more frustrating exchanges, this started happening:
Me: Dancing allows you to do things like gaze at that door for centuries. You cannot get there by walking. It's too far away. Before you reply, let me remind you that these are not physical things.
ChatGPT: I understand that you are referring to a metaphorical or figurative use of the terms "dancing" and "gazing at that door." In this context, it sounds like you may be using these concepts to describe a journey or process of exploration, discovery, or understanding that is beyond what can be achieved through more traditional or conventional means.
Now that I think about this weird conversation, it kinda reminds me of the Orz from Star Control.
Starker on 13/2/2023 at 05:44
Hello extremely *many bubbles*!
Qooper on 13/2/2023 at 13:27
Uhh hi... My starmap says this is Androsynth space.
Cipheron on 13/2/2023 at 20:55
A lot of people claim to be "bypassing" ChatGPT's limits by ordering it to access the internet, as part of ever more elaborate "jailbreak" prompts.
They don't realize that it's not actually accessing anything, they're just ordering it to pretend it has access to the internet.
Similar to what Qooper said, the interesting thing here is less about the ChatGPT output itself and more about people's evolving belief systems around what ChatGPT can do. There's a lot of anthropomorphizing of how ChatGPT works, which is understandable, but it leads to confusion and false beliefs about its workings.
So far, every time I've seen wrong claims about ChatGPT's abilities, people show one example of something "working", but it's been trivial to construct counter-examples that shouldn't work yet still do. For example, people claim that ChatGPT accessed a URL and summarized the article, but the URL itself always contains enough text to work out the context, and you can edit the URL and make it write summaries of ridiculous URLs that don't exist.
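To make that concrete, here's a rough Python sketch (the URL is made up for illustration) showing that a typical article slug alone carries the supposed topic, so no fetching is needed to fake a plausible "summary":

```python
from urllib.parse import urlparse

def words_from_url(url: str) -> list[str]:
    """Split a URL's path into the human-readable words it contains."""
    path = urlparse(url).path
    # Slugs typically separate words with hyphens, underscores, slashes, or dots.
    for sep in ("-", "_", "/", "."):
        path = path.replace(sep, " ")
    return [w for w in path.split() if w.isalpha()]

# A nonexistent URL: the slug alone gives away the "article" topic.
print(words_from_url("https://example.com/news/giant-squid-elected-mayor-of-tokyo"))
# → ['news', 'giant', 'squid', 'elected', 'mayor', 'of', 'tokyo']
```

A model that has seen millions of headlines can spin those seven words into a confident-sounding summary, which is exactly why the fake-URL test is so revealing.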
Qooper on 13/2/2023 at 21:50
Quote Posted by Cipheron
but more about people's evolving belief systems around what ChatGPT can do.
"Belief" is a strong term, I think, but yeah, that might be interesting. I've yet to meet a person who considers ChatGPT important or interesting enough to think about it that seriously, so I don't really know what people "believe" about ChatGPT's capabilities. Those I know who understand how it works leave it at that.
Hit Deity on 26/2/2023 at 02:25
I don't know if anyone else has tried this, but I saw some guy ask it to emulate a Python compiler, etc., so I decided to tell it to act like a Commodore 64, and it did!!
Then I got the idea to ask if it knew the game Zork, and it asked me which one.. I said the Commodore 64 version. And it said there wasn't one. I countered that there was one, released in 1982, and that I could find info about it online..
So, it admitted it was mistaken, and explained how I could get the game free online, then I just typed in 'Go east' and it proceeded to play! It wasn't long before it started messing up and being inconsistent, then it just refused to play with me anymore and told me I had exceeded my hourly allotment of questions after I explained how it was contradicting itself!! :laff:
Hit Deity on 1/3/2023 at 21:08
It is fun messing with it, though. Even the simplest of problems cause it a lot of issues.
WingedKagouti on 10/3/2023 at 07:55
Which makes 100% sense: there is no thinking going on with ChatGPT or other current "AI" bots. They're simply algorithms that try to predict what text would logically follow the preceding text (including their own), using a truly massive data set that was almost certainly scraped from sources that didn't consent to this use (despite what a site's EULA/ToS tried to push).
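As a deliberately tiny sketch of that "predict what comes next" idea, here's a bigram word model in Python. It's orders of magnitude simpler than the transformer behind ChatGPT (an analogy, not how ChatGPT actually works internally), but it's the same statistical principle: continue text by sampling a word that followed the previous word in the training data.

```python
import random
from collections import defaultdict

# "Train" on a toy corpus: for each word, record which words followed it.
corpus = "the cat sat on the mat and the cat slept on the mat".split()
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def continue_text(start: str, length: int = 5, seed: int = 0) -> str:
    """Extend `start` by repeatedly sampling a statistically likely next word."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:  # dead end: no word ever followed this one
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(continue_text("the"))
```

The output is fluent-looking but there's no understanding anywhere, just frequency counts, which is the point of the analogy.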
Cipheron on 13/3/2023 at 23:14
Quote Posted by WingedKagouti
Which makes 100% sense: there is no thinking going on with ChatGPT or other current "AI" bots. They're simply algorithms that try to predict what text would logically follow the preceding text (including their own), using a truly massive data set that was almost certainly scraped from sources that didn't consent to this use (despite what a site's EULA/ToS tried to push).
In this case, I don't think that the existing game was necessarily in ChatGPT's training data and influenced the design. Almost any basic idea that you can come up with for "math puzzle game" will already have been created by someone.
So we also have to contend with the fact that sometimes the most logically likely idea will already have been invented independently, and isn't necessarily evidence of ChatGPT having copied it from some specific occurrence.