It was nice listening to Sam Altman’s presentation at OpenAI’s Dev Day.
NLP and its use cases have come a long way. Once limited to Google’s autocomplete, things have moved into a much richer contextual, conversational space, and the ecosystem is now mature enough to grasp the bigger use cases.
I believe a lot more has already been developed by now but will be released on its own timeline (or maybe never). However, many developers are hitting rate-limit-exceeded errors even before they really start using the APIs, as if access is being granted in a phased manner according to OpenAI’s roadmap.
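For developers running into those rate-limit errors, a common workaround is to retry with exponential backoff. Here is a minimal sketch, assuming the standard chat completions HTTP endpoint, an OPENAI_API_KEY environment variable, and the requests library; the model name and prompt are placeholders, not a definitive implementation.

```python
import os
import time
import requests

API_URL = "https://api.openai.com/v1/chat/completions"  # standard chat endpoint
API_KEY = os.environ["OPENAI_API_KEY"]  # assumes the key is set in the environment

def chat_with_backoff(messages, model="gpt-4", max_retries=5):
    """Call the chat completions endpoint, retrying on 429 (rate limit) responses."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"model": model, "messages": messages},
            timeout=60,
        )
        if resp.status_code == 429:
            # Respect Retry-After if the server sends it, otherwise back off exponentially
            wait = float(resp.headers.get("Retry-After", delay))
            time.sleep(wait)
            delay *= 2
            continue
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    raise RuntimeError("Rate limit still exceeded after retries")

# Hypothetical usage:
# print(chat_with_backoff([{"role": "user", "content": "Hello"}]))
```

Backoff only smooths over bursty traffic, of course; if the quota itself is the constraint, the phased access described above is the real bottleneck.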
Enterprise data training may continue to be a roadblock for OpenAI’s cloud-based offerings. Eventually, it may come down to an in-house, chip-based edge solution.