"Power using" AI tools

I haven’t, but I’d like to. Are you building them into an app or just training and using them standalone in Xcode? (Not sure what’s all possible.)

Re LLM prompts: some resources and theory are emerging. I think the field of “engineering” these is transient but the tools can be adopted or improved on quickly enough.

2 Likes

So cool. Thanks for sharing this!!!

1 Like

Interesting comment - can you elaborate on that?

Well, I thought about it more and I’m less sure I agree with myself. That said:

I was thinking that prompt design and management is merely a response to having to use a few large, corporate models whose designers have an adversarial relationship with structured input.

The engineers of the large corporate-owned models are pushing to get users away from structured input/queries, and especially away from attempts to get output beyond what was trained/intended. There is an inherent adversarial relationship that is going to make it difficult for prompt engineering to mature. I’d argue that if there is no maturation and standardization, the field is more akin to test piloting or analysis than engineering.

On the other hand, we’ll also have more individually owned models (eventually anyone will be able to run GPT-4-equivalent LLMs on their laptop, and one day train them, too). All of the engineering effort from individual users should migrate towards making the model better, not the prompts. Eventually, there should be no need to trick your own model when you can just fix it.
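
To make the “individually owned model” idea concrete, here’s a minimal sketch of local inference with the Hugging Face transformers library. The model and prompt are placeholders I picked for illustration; nothing GPT-4-class runs this way yet, which is exactly the speculation.

```python
# Minimal local-inference sketch, assuming the Hugging Face
# `transformers` package is installed. "gpt2" is a small placeholder
# model; the point is that it runs entirely on your own machine.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# No corporate API in the loop: you own the weights, so improving
# the model (fine-tuning) replaces tricking it with prompts.
result = generator(
    "Prompt engineering is transient because",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```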

2 Likes

Use an AI-powered chatbot if it helps your analysis and research, but don’t release a chatbot’s work as your final product.

1 Like

More grist for the mill here – the octopus analogy at the top of the piece is good (and predates the current kerfuffle over LLMs); but there are good nuggets throughout:

We’ve learned to make “machines that can mindlessly generate text,” Bender told me when we met this winter. “But we haven’t learned how to stop imagining the mind behind it.”

You Are Not a Parrot / And a chatbot is not a human. And a linguist named Emily M. Bender is very worried what will happen when we forget this.

2 Likes

Here’s a blog post I recently wrote about how I’m using AI to improve my writing: Using AI to Improve Your Writing: Tips and Techniques.

I hope this can benefit someone!

3 Likes

This is a follow-up post on the use of AI. When does AI use replace human creativity? I want to use AI to improve my writing, but I want to be the creator. AI and Creativity: Finding the Balance Between Assistance and Replacement.

1 Like

It seems like a fine line to walk, using a statistically based helper tool in one’s writing. Tools like ChatGPT and Bing’s chatbot produce predictable mimicry. I’m left wondering how that matches up with the typical reasons for writing a blog in the first place.

I like that concept of treating an LLM as an advanced test suite for writing. Now that the API for ChatGPT (and soon GPT-4) is available and so inexpensive, it shouldn’t be hard to build suggestions into a CMS that supports publishing workflows.
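
As a rough sketch of what that could look like, assuming the openai Python package as it shipped when the ChatGPT API launched (pre-1.0; newer releases use a different client class). The `suggest_edits` function and the pre-publish hook idea are hypothetical, not any particular CMS’s API:

```python
# Hypothetical pre-publish "writing test suite" step for a CMS,
# using the early-2023 openai package (pre-1.0 interface).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def suggest_edits(draft: str) -> str:
    """Ask the model for specific critiques rather than a rewrite."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You are an editor. List specific problems "
                        "(clarity, grammar, structure) in the draft. "
                        "Do not rewrite it."},
            {"role": "user", "content": draft},
        ],
        temperature=0.2,  # keep the critique focused and repeatable
    )
    return response.choices[0].message.content

# A publishing workflow could run this before a post goes live and
# surface the suggestions for the author to accept or ignore.
print(suggest_edits("Their going to launch the new feature tommorow."))
```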

I’ve been amazed at how helpful it can be, in all of the scenarios I describe in my last post.

My objective in using AI is not to copy and paste the “predictable mimicry,” but to use it as a tool to improve my writing. If I just asked AI to write an article with no input from me, then that’s what it would be. But that’s not how I’m using it.

It’s still my ideas, my thoughts, my organization, and most of the time my specific words. However, before tools like ChatGPT I sometimes used Grammarly to rewrite poor sentences, used web searches to do research, etc. Those were also AI, and could be used for “predictable mimicry” as well.

It’s all a matter of how you choose to use AI in your writing process.

I never said it would not be helpful. Clearly, statistical AI is good at putting lots of words down on a page.

In ways and orders that are very similar to what other people have already done.