Google NotebookLM--Another tool for note-taking

Absolutely, which is why we are adding one or more AI courses to our STEM program; we are not sure what that looks like, but prompt engineering and “big data analysis” may be components of the program. We are also exploring to what extent AI can help us identify school-wide academic performance trends and individual academic strengths and weaknesses to help us better customize instruction to individual student needs.

1 Like

This is a high school? Sounds more sophisticated than many colleges. Wow, how I would have enjoyed such a school if it had existed in my day.

Where is the school? What is the tuition?

We are blessed to be able to provide an education that is thoroughly Christian and academically excellent, though there is always room for improvement, beginning with the Head of School. :slightly_smiling_face:

It is a 7-12 school in Town & Country, MO. Tuition is $21,000/year. We provide 33% of the student body with financial assistance totaling $3.2M. Most students’ tuition is offset by the college scholarships our students are awarded. Consequently, parents consider the school an investment in their children’s spiritual and academic development and a good ROI considering the college scholarships (academic and athletic) awarded.

1 Like

I would not leave my notes to Google. It is sad that they have neglected the space even in their Enterprise G Suite offering (i.e., Google Keep is too basic). But one can wait a little bit and see what Microsoft does with OneNote; that’s going to be where the real stuff is.

Edit: to say that I did not intend to quote-reply to you specifically (fat fingers!), but my point remains: you didn’t mention Google :slight_smile:

I probably won’t. It is challenging to convert Google Docs to various other formats. I prefer to use Obsidian, DEVONthink, or other more versatile applications. That said, I’m primarily interested in effective, reliable, and trustworthy integration of AI in my everyday tools—writing apps, task managers, calendars, and the like. :slightly_smiling_face:

At work, we use Google Workspace for everything. I enjoy using it. But for my private data I don’t trust them. The problem is that they kill services all the time. And this one is labelled as experimental.

I started using Gmail in 2004 and it stayed in beta for five years. :grinning:

2 Likes

I’m still in Beta, I’m a work in progress, just ask my wife. :slightly_smiling_face:

4 Likes

I see; interesting. If I’m understanding correctly, the process is:

  • Receive your prompt (your question)
  • Analyze the prompt and your PDFs to determine which PDFs are most likely to be relevant, then structure the PDF content to include or prioritize it by relevance
  • Add content from PDFs into a revised prompt, engineered to optimize the LLM’s response to it
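The steps above can be sketched in a few lines. To be clear, this is a toy illustration of the general retrieval-augmented flow; the function names and the word-overlap scoring heuristic are my own assumptions, not NotebookLM’s actual implementation:

```python
# Toy RAG sketch: rank sources, keep the most relevant, splice them into
# an engineered prompt. Illustrative only, not NotebookLM's real pipeline.

def score(query: str, doc: str) -> int:
    """Rough relevance: count query words that also appear in the document."""
    doc_words = set(doc.lower().split())
    return sum(1 for w in set(query.lower().split()) if w in doc_words)

def build_prompt(query: str, docs: dict[str, str], top_k: int = 2) -> str:
    # 1. Rank source documents by relevance to the user's prompt.
    ranked = sorted(docs.items(), key=lambda kv: score(query, kv[1]), reverse=True)
    # 2. Keep only the top few (bounded by the model's context window).
    chosen = ranked[:top_k]
    # 3. Splice the retrieved content into a revised prompt for the LLM.
    context = "\n\n".join(f"[{name}]\n{text}" for name, text in chosen)
    return f"Answer using only these sources:\n\n{context}\n\nQuestion: {query}"

docs = {
    "notes.pdf": "NotebookLM grounds answers in your uploaded sources.",
    "recipe.pdf": "Whisk the eggs and fold in the flour.",
}
prompt = build_prompt("How does NotebookLM ground its answers?", docs, top_k=1)
```

With `top_k=1`, only the notes document survives the cut, so the irrelevant recipe never reaches the model; a real system would use embedding similarity rather than word overlap, but the shape of the pipeline is the same.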

So, ultimately it looks like it is limited by the context size. AWS Kendra gives you a 20k retrieval context and will let you pass all of it into the prompt if you want. NotebookLM’s is 5-10x bigger.

There is a fundamental tradeoff between the size of the retrieved content passed into the prompt and how well the LLM can use it and precisely answer, but Google might have additional secret sauce here that would let them perform as well as Kendra on a larger dataset.

At least, that’s my understanding. Hopefully they’ll share more details (and same for how they’re preventing a quality drop-off with the one million token context, assuming they’re doing something other than allowing tenfold the processing vs. 128k.)

That is a great description of the process! The term “Retrieval-Augmented Generation” turns out to be very apt. I believe NotebookLM, or tools like it, could prove highly beneficial for DevonThink users with extensive databases of articles.

Agreed; this approach seems ideal for a folder of documents in DT, ideally with a locally run model. I hope Devon is working on something clever.

Hopefully it stays around and does not go the way of 293 services that were killed.

Google had its notebook before, between 2008 and 2011.

I’m not worried about Google Workspace, it’s making money and gaining some large customers like Airbus and the US Army. And to be fair a few of the services they “killed” were just name changes or upgrades. I got a Grand Central phone number in 2006 and it transferred to Google Voice when they bought the company.

But yes, Google throws a lot of services at the wall to see what sticks. Lately they’ve started bringing in outside experts, like Steven Johnson, to work with their developers. Johnson is the author of thirteen books and a longtime Devonthink user. Maybe his working on NotebookLM will help them raise their batting average . . . maybe.

1 Like

+1 for Google not just doing the same thing over and over and hoping it turns out differently.

1 Like

AFAIK, it does not depend on the context window size but on the embedding dimensionality. I guess both are related, though. But you could conceivably use different sizes.
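A quick toy illustration of that point: an embedding maps text of any length to a vector of fixed dimensionality, so the vector size is independent of the model’s context window. The word-hashing scheme below is a stand-in of my own, not any real embedding model:

```python
# Toy embedding: text of any length -> fixed-size vector.
# The hashing scheme is illustrative only, not a real model.

EMBED_DIM = 8  # real models use hundreds or thousands of dimensions

def embed(text: str) -> list[float]:
    """Map text to a unit vector of EMBED_DIM via word hashing."""
    vec = [0.0] * EMBED_DIM
    for word in text.lower().split():
        vec[hash(word) % EMBED_DIM] += 1.0
    norm = sum(x * x for x in vec) ** 0.5 or 1.0
    return [x / norm for x in vec]

short_vec = embed("notebook")
long_vec = embed("a much longer passage " * 100)
# Both vectors have the same dimensionality despite very different input lengths.
```

That fixed dimensionality is what makes retrieval cheap: you compare small vectors, and the context window only matters later, when deciding how much retrieved text to stuff into the prompt.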

Also, that glorious 1 million token context is not exactly cheap; I think I’ve read it costs $0.50 per request!

Okay, I’ll bite: this account has posted five times in the past hour after being dormant for almost two years, and each post looks like stock ChatGPT output. In particular, the author of this post appears not to have seen the video in the OP it refers to: "We Are One" Black History Month Program Celebrating the Negro Leagues - #2 by jackyjoy123

What gives?

3 Likes

I’m confused. This older post is not connected to the video I posted on a separate post yesterday about BHM baseball. What am I missing? Where does ChatGPT come to play in this?

I suspect that jackyjoy123 is posting from ChatGPT in an unusual pattern. Review their recent posts to judge for yourself.

When I wrote “this post” above, I was referring to the one I then linked, not the reply in this thread. (In the thread I linked to, you posted a video in the OP, but they wrote “I’d love to see the video clip”, implying that the text was read but the video was not noticed.)

3 Likes

Yes, that confused me; thank you for the clarification.

1 Like