Email as an asynchronous interface is an interesting take on chat AI.
Giving out any email address on the main company domain sounds risky. Couldn't somebody register hostname@ or ceo@?
The FAQ should explain who the conversation is shared with. The generic "data is stored [encrypted]" only covers storage. Is an uploaded PDF sent to a third party, possibly to train their next version?
I always hated the streaming text responses in chat interfaces; it seemed like a cynical attention hack to continuously update the content instead of just returning the answer once it was done. So I think email makes a lot of sense. It especially makes slow-running local models a bit more bearable: instead of watching the model type at 20 words per minute, it can just get back to me when it reaches a stop token. Plus, it can attach any relevant files, or files it generates. I wrote a chatbot 8 years ago (ChatScript dialog trees) that would gather parameters and run a SQL script, returning the resulting table as a .csv you could download. I wish I had thought of doing it as an email character; it would have integrated with existing processes way more easily than "log in to this service we just spun up whenever you want to use it..."
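The gather-parameters → run-SQL → reply-with-CSV flow described above maps naturally onto email. A minimal sketch of that idea, using Python's stdlib with an in-memory SQLite database standing in for the real one (the table, column names, and bot address are all hypothetical):

```python
import csv
import io
import sqlite3
from email.message import EmailMessage

def run_report(db, region):
    # The "SQL script" the dialog tree would run, with a gathered parameter.
    rows = db.execute(
        "SELECT name, total FROM orders WHERE region = ? ORDER BY name",
        (region,),
    ).fetchall()
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["name", "total"])
    writer.writerows(rows)
    return buf.getvalue()

def build_reply(to_addr, csv_text):
    # Reply email with the result table attached as a .csv, delivered
    # once the job is done rather than streamed into a chat window.
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["From"] = "reports@example.com"  # hypothetical bot address
    msg["Subject"] = "Your report"
    msg.set_content("Report attached.")
    msg.add_attachment(csv_text, subtype="csv", filename="report.csv")
    return msg

# Demo with an in-memory database standing in for the real one.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (name TEXT, region TEXT, total REAL)")
db.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("acme", "west", 12.5), ("globex", "east", 7.0), ("initech", "west", 3.25)],
)
csv_text = run_report(db, "west")
msg = build_reply("user@example.com", csv_text)
```

In a real deployment the message would be handed to an SMTP server (e.g. via `smtplib`) in reply to the inbound thread, which is what makes the asynchronous, notification-driven workflow work.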
Agreed, definitely some pros to using LLMs in email vs chat. Feels pretty natural to send an email and get a response shortly after with a notification, etc.
Fair points. We have the ability to restrict any address, so it shouldn't be much of a concern on our end to shut down ceo@, etc.
We don't use any data for our own training. We send it to Gemini to process the prompt, and we encrypt it at rest. Encrypting it at rest allows us to use it as context in your email thread as more replies come in.