Sony Wants Its Own Crypto Bank Too - Here’s Why It Matters

Author: decryptCO
Published: 2025-10-15 16:34:12


Sony joins the banking revolution—crypto style.

The Corporate Crypto Gold Rush

Another tech giant wants a piece of the digital finance pie. Sony's latest move signals that mainstream adoption isn't just coming—it's already here. Traditional banking models, watch out.

Why Crypto Banking?

Companies aren't just dabbling in crypto anymore. They're building entire financial infrastructures around digital assets. Lower fees, faster transactions, and global reach make traditional banking look like ancient history.

The Regulatory Dance

Getting approval won't be easy. But when has that ever stopped innovation? Financial authorities worldwide are scrambling to keep up with tech companies rewriting the rulebook.

Market Impact

More institutional players mean more legitimacy. More legitimacy means higher valuations. The cycle continues—until someone figures out how to properly value these things beyond the 'number go up' mentality.

Because what's better than making gaming consoles? Controlling the money that buys them—with a healthy side of regulatory headaches and volatile assets, naturally.

AI and public safety

OpenAI and other AI companies have faced increasing pressure over how the technology influences users, particularly children. The company, along with others in the industry, has faced lawsuits from parents alleging that AI conversations contributed to teen suicides, prompting OpenAI to introduce parental controls earlier this year. Others have blamed dependence on the chatbots for breakdowns in relationships and increased isolation.

The creation of the well-being council following public pressure highlights an ongoing issue in the tech sector, where companies confront the psychological and ethical consequences of their products only once they are already in mass circulation. Critics say it’s a familiar cycle of innovation first, accountability later.

“This seems part of the usual pattern of move fast, break things, and try to fix some things after they get embarrassing,” a spokesperson for NGO AlgorithmWatch told Decrypt.

AlgorithmWatch also questioned how independent the new council would be, noting OpenAI’s history of internal power struggles. “We should remember that when the previous OpenAI board tried to have an impact by expressing their distrust in Sam Altman, they were all removed and replaced,” they added.

AlgorithmWatch suggested that a “slightly better (but still limited) precedent” might be the Meta Oversight Board, noting that while its recommendations are “often very slow and frequently ignored,” the board provides “clear recommendations” that are “public enough that people can see what Meta is ignoring.”

Shady El Damaty, co-founder of Holonym and a digital rights advocate, told Decrypt he found it “ironic” that the same companies racing to deploy the most powerful AI tools are now positioning themselves as “moral referees.”

“But these conversations are urgent and overdue, so I won’t knock the existence of the council. If anything, I hope it raises the floor for everyone,” he added.

In addition to well-being, he said he’d like to see the council address issues around privacy and identity. “At a minimum, the council should establish transparent and public metrics for measuring AI's emotional impact and mandate regular, independent audits,” he said. “But we really need hardened rules and regulations that protect our digital rights, and we need them sooner rather than later.”

OpenAI’s Expert Council “has a chance to go deeper than safety,” he added. “They should be asking: What rights do people have in digital spaces? Who owns their identity, their behavior, their likeness? What does human-first design actually look like… not just for kids, but for everyone?”

OpenAI eases adult content restrictions

The company’s renewed focus on well-being also coincided with CEO Sam Altman’s announcement that OpenAI will begin relaxing restrictions, including on adult content, come December.

“We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues,” Altman tweeted on the same day.

“We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right.”

Altman said verified adults will soon be allowed to create erotica using ChatGPT, describing it as part of a broader principle to “treat adult users like adults.”
