Many of my friends already use ChatGPT for their daily work. This will surely be a revolution, though I'm not sure whether it'll be a good one or a bad one.
I'm using it on a daily basis as well. It's pretty good. My use cases are:
1. Generating basic scripts to solve problems I have. Usually the right bits get thrown together and I just have to fix up a few things. It's like having a junior developer at hand.
2. Filling in blurb in PowerPoint presentations that I have to do. I also used it to write a reference for someone recently, as an experiment.
3. Consolidating info into one place. It's pretty good at producing rational answers to comparison questions, for example on high-level topics or solutions, or at giving a rationale for approving or declining an idea.
I'm not sure that the end game for this class of technology is going to be good, however. This is just the first competent implementation rather than a big technology demonstrator. Judging by how the human race generally handles new technology, I expect this will be used mostly for ill.
What I'm expecting to see is an arms race between the big technology companies. I've written about this in detail elsewhere, which I won't link here as it's tied directly to my real identity. Fundamentally, this technology is very easy to weaponise. Content generation is the hill on which we die. It's already a major problem: Google are having trouble discriminating between ML content farms and quality sources, and even the big news agencies and vendors are using it to generate content. Google of course need to step up their game there, so they will use their own competing technology, just announced, to do the same thing.

Ultimately, knowledge seeking will be divided among a few large peers, each building their training and information corpus on top of content generated by other tools doing the same thing. Much like a JPEG that has been re-encoded over and over again, errors will creep in, and the outcome will be a signal-to-noise ratio that completely compromises the whole system and makes all sources of information useless.
This will lead to an Idiocracy-style dark age of decaying information. Of course this situation will be manipulated as people work out how the models work, leading to politically controlled outcomes, and after some horrific human sacrifice, as always, the technology will be regulated. Following that, there will be a new enlightenment era built on carefully curated and moderated content, much like an old-fashioned encyclopaedia (or Encarta!). Not Wikipedia, though. That will have burned with the rest of the world.