In short, the answer is ‘no’. The full rationale is here, and here is an excerpt from the journal's policy:
Nonhuman artificial intelligence, language models, machine learning, or similar technologies do not qualify for authorship.
If these models or tools are used to create content or assist with writing or manuscript preparation, authors must take responsibility for the integrity of the content generated by these tools. Authors should report the use of artificial intelligence, language models, machine learning, or similar technologies to create content or assist with writing or editing of manuscripts in the Acknowledgment section or the Methods section if this is part of formal research design or methods.
This should include a description of the content that was created or edited and the name of the language model or tool, version and extension numbers, and manufacturer. (Note: this does not include basic tools for checking grammar, spelling, references, etc.)
A key limitation of ChatGPT for academic work is that the sources of its statements are not citable. However, that may soon change (see AllSearch.ai as an early example).