Following on OpenAI's heels, Google has published a policy proposal in response to the Trump administration's call for a national "AI Action Plan." The tech giant endorsed weak copyright restrictions on AI training, as well as "balanced" export controls that "protect national security while enabling US exports and global business operations."
"The US needs to pursue an active international economic policy to advocate for American values and support AI innovation internationally," Google wrote in the document. "For too long, AI policymaking has paid disproportionate attention to the risks, often ignoring the costs that misguided regulation can have on innovation, national competitiveness, and scientific leadership, a dynamic that is beginning to shift under the new administration."
One of Google's more controversial recommendations concerns the use of IP-protected material.
Google argues that "fair use and text-and-data-mining exceptions" are "critical" to AI development and AI-related scientific innovation. Like OpenAI, the company is seeking to codify the right for itself and rivals to train on publicly available data, including copyrighted data, largely without restriction.
"These exceptions allow for the use of copyrighted, publicly available material for AI training without significantly impacting rightsholders," Google wrote, "and avoid often highly unpredictable, imbalanced, and lengthy negotiations with data holders during model development or scientific experimentation."
Google, which has trained a number of models on public, copyrighted data, is <a target="_blank" rel="nofollow" href="https://www.ballardspahr.com/insights/alerts-and-articles/2024/05/google-facing-new-copyright-suit-over-ai-powered-image-generator">battling</a> <a target="_blank" rel="nofollow" href="https://www.tomshardware.com/tech-industry/artificial-intelligence/youtube-creator-sues-nvidia-and-openai-for-unjust-enrichment-for-using-their-videos-for-ai-training">lawsuits</a> brought by data owners who accuse the company of failing to notify and compensate them before doing so. US courts have yet to rule on whether the fair use doctrine effectively shields AI developers from IP litigation.
In its proposal, Google also takes issue with certain export controls imposed under the Biden administration, which it says "may undermine economic competitiveness goals" by "imposing disproportionate burdens on US cloud service providers." That contrasts with statements from Google competitors like Microsoft, which in January said it was "confident" it could "comply fully" with the rules.
Importantly, the export rules do carve out exemptions for trusted businesses seeking large clusters of advanced AI chips.
Elsewhere in its proposal, Google calls for "long-term, sustained" investments in foundational domestic R&D, pushing back against recent federal efforts to reduce spending and eliminate grant awards. The company said the government should release datasets that might be helpful for commercial AI training, and should allocate funding to "early-market R&D" while ensuring that computing resources and models are "broadly available" to scientists and institutions.
Pointing to the chaotic regulatory environment created by the US's patchwork of state laws, Google urged the government to pass federal legislation on AI, including a unified privacy and security framework. Just over two months into 2025, <a target="_blank" rel="noreferrer noopener nofollow" href="https://x.com/AdamThierer/status/1898011353807265981">the number of AI bills in the US has grown to 781</a>, according to an online tracking tool.
Google cautions the US government against imposing what it perceives to be onerous obligations around AI systems, such as usage-liability obligations. In many cases, Google argues, the developer of a model "has little to no visibility or control" over how a deployer uses that model, and therefore shouldn't bear responsibility for misuse.
"Even in cases where a developer provides a model directly to deployers, deployers will often be best placed to understand the risks of downstream uses, implement effective risk management, and conduct post-market monitoring and logging," Google wrote.
Google also called disclosure requirements like those being contemplated by the EU "overly broad," and said the US government should oppose transparency rules that require "divulging trade secrets, allow competitors to duplicate products, or compromise national security by providing a roadmap to adversaries on how to circumvent protections or jailbreak models."