"It’s pretty simple: Google Meet (original) was previously Meet, which was the rebranded Hangouts Meet. Meet has been merged with Google Duo, which replaced Google Hangouts. Google Duo has been renamed Meet, and Meet has been temporarily named Google Meet (original), for clarity"
I tried using it to summarize some lecture videos, and the summaries ranged from average to bad. Nothing I couldn't already get from the description. Even ChatGPT 4o spits out far better content.
So far my method has been to take the transcript and feed it to an LLM with a customized summary prompt.
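That workflow can be sketched in a few lines. This is a minimal, hedged version, not the commenter's actual code: the prompt template and chunk size are assumptions, and the LLM call is abstracted into a plain prompt-to-text callable so you can plug in whichever provider you use.

```python
import textwrap

# Illustrative prompt template -- adjust to your own summarization needs.
SUMMARY_PROMPT = (
    "You are summarizing a lecture transcript.\n"
    "Give 5-8 bullet points covering the key ideas, "
    "then a one-paragraph takeaway.\n\n"
    "Transcript:\n{chunk}"
)

def chunk_transcript(transcript: str, max_chars: int = 8000) -> list[str]:
    """Split a long transcript into pieces that fit the model's context window."""
    return textwrap.wrap(
        transcript, max_chars, break_long_words=False, break_on_hyphens=False
    )

def summarize(transcript: str, call_llm) -> str:
    """Summarize each chunk, then merge the partial summaries.

    `call_llm` is any prompt -> text callable (e.g. a thin wrapper
    around your provider's chat-completion API).
    """
    partials = [
        call_llm(SUMMARY_PROMPT.format(chunk=c))
        for c in chunk_transcript(transcript)
    ]
    if len(partials) == 1:
        return partials[0]
    return call_llm(
        "Merge these partial summaries into one:\n\n" + "\n\n".join(partials)
    )
```

Chunking first and merging afterwards is what makes this work on hour-long lectures whose transcripts exceed a single context window.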
How do you inform relatives who are fascinated by this feature and use it on the road? I am scared for them, but I don't know how to educate them. I also don't want to come across as having an anti-Musk bias.
Why would anyone care about showing an anti-Musk bias? Not everyone has to like the guy.
Show your family the articles. Show them the videos of Musk promising FSD by 2018/2019, and somehow it's still not here in 2024. Then let your family make their own choice. If they want to use FSD and endanger themselves and others, that is out of your control.
Why would their safety be affected by forward-looking statements? Has Elon Musk ever said that Tesla has now perfected self-driving and that driver attention is no longer required?
I try to think about what should be a framework and what should be a library. Libraries are tools that help you achieve a task: for example, building a prompt, calling LLM models, or communicating with a vector database.
Frameworks are more process-driven, aimed at achieving a complex task. ReactJS is like this with its component model: it sets a process for building web applications so that you can build more complex ones, while still giving you lots of flexibility in the implementation details. A framework should provide as much flexibility as possible.
Similarly, we are trying to build our framework to streamline the LLM development process so that you can iterate on your LLM application faster. To set up this process, we enforce very high-level interfaces for how you build (input and output schemas), evaluate, and deploy your application. We leave all the low-level implementation details to the developer and keep the framework extensible, so you can also use any external tools you want within it.
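Concretely, that split between a fixed high-level interface and free low-level implementation might look something like the sketch below. All names here are illustrative, not the framework's actual API: the framework fixes the typed input/output schemas and the build/evaluate contract, while the developer's subclass keeps full control of how the task is implemented.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Hypothetical schemas -- the framework would enforce typed input and output.
@dataclass
class TaskInput:
    question: str

@dataclass
class TaskOutput:
    answer: str

class LLMTask(ABC):
    """The framework fixes *what* a task must expose (build, evaluate);
    the developer decides *how* each step is implemented."""

    @abstractmethod
    def build(self, inp: TaskInput) -> TaskOutput:
        ...

    @abstractmethod
    def evaluate(self, inp: TaskInput, out: TaskOutput) -> float:
        ...

class EchoTask(LLMTask):
    """Trivial example implementation; a real task would prompt an LLM here,
    using any prompting library or external tool the developer prefers."""

    def build(self, inp: TaskInput) -> TaskOutput:
        return TaskOutput(answer=inp.question.upper())

    def evaluate(self, inp: TaskInput, out: TaskOutput) -> float:
        # Placeholder metric: non-empty answers score 1.0.
        return 1.0 if out.answer else 0.0
```

Because the framework only sees the abstract interface, it can evaluate and deploy any task implementation uniformly, which is what enables the faster iteration loop described above.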
So it's a pretty simple wrapper around the LLM in use (currently GPT-4o); it does not add much technically.
It does not use a database for any "random search", but yes, columns.ai is a data analytics tool that lets you connect supported live data sources like Google Spreadsheet, Airtable, and Notion Database to create visual stories.
The analytics engine is home-built (https://github.com/varchar-io/nebula), but it is not a database. And I don't use LLM agents, just logic to purify the data returned by the LLM and fit it into an optimized visualization.
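A "purify" step like the one described could be as simple as validating the LLM's raw output before it reaches the chart layer. This is a hedged sketch, not columns.ai's actual logic: the expected `label`/`value` fields are assumptions made for illustration.

```python
import json

def purify(raw: str) -> list[dict]:
    """Coerce a raw LLM response into clean {label, value} rows for a chart.

    Drops anything that doesn't parse as JSON or is missing the
    expected fields, so the visualization layer only ever sees
    well-formed data.
    """
    # Models often wrap JSON in markdown fences; strip them first.
    raw = raw.strip()
    raw = raw.removeprefix("```json").removeprefix("```").removesuffix("```").strip()
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return []
    clean = []
    for row in rows if isinstance(rows, list) else []:
        if (
            isinstance(row, dict)
            and "label" in row
            and isinstance(row.get("value"), (int, float))
        ):
            clean.append({"label": str(row["label"]), "value": row["value"]})
    return clean
```

Keeping this as deterministic post-processing, rather than an agent loop, makes failures cheap: bad rows are simply dropped instead of triggering another model call.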