Before generative AI captured the public imagination, most artificial intelligence models were private, built for a single organization and trained on that organization's own data. Now many organizations access public AI models that pull data from multiple sources, while others are looking for ways to leverage the benefits of large language models in private environments. Private AI keeps an organization's data secure and keeps the model free of outside data. But if the model relies on off-premises processors, a private connection will be needed. This creates new opportunities for edge infrastructure and for secure connections to and from that infrastructure. As private AI models evolve from training to inference, more organizations may want private wireless networks to handle the data these models will generate.