

Federal Tech Podcast: for innovators, entrepreneurs, and CEOs who want to increase reach and improve brand awareness
John Gilroy
The federal government spends $90 billion on technology every year.
If you are a tech innovator and want to expand your share of the market, this is the podcast for you to find new opportunities for growth.
Every week, Federal Tech Podcast sits down with successful innovators who have solved complex computer system problems for federal agencies. They cover topics like Artificial Intelligence, Zero Trust, and the Hybrid Cloud. You can listen to the technical issues that concern federal agencies to see if your company's capabilities can fit.
The moderator, John Gilroy, is an award-winning lecturer at Georgetown University and has recorded over 1,000 interviews. His interviews are humorous and entertaining despite handling a serious topic.
The podcast answers questions like . . .
How can software companies work with the federal government?
What are federal business opportunities?
Who are the cloud providers who work with the federal government?
Should I partner with a federal technology contractor?
What is a federal reseller?
Connect to John Gilroy on LinkedIn
https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes?
www.Federaltechpodcast.com
Episodes

Apr 2, 2026 • 31min
Agentic AI Transforms Federal Mission at Scale
Today, we sat down with Paul Tatum, Executive Vice President, Global Public Sector at Salesforce, to hear how Salesforce can help federal agencies reach ambitious goals with Agentic AI. By now, everyone has played around with AI, and possibly some agents. Viewed independently, they can dazzle. Unfortunately, the federal government expects action based on data. If you isolate Agentic AI, you can fall into the trap of lacking the ability to scale, ensure security, and maintain control. In recent weeks, notable technology leaders have jumped headfirst into agentic offerings from new vendors. What is not reported is that many have jumped back out because of privacy concerns. Salesforce can serve as the "adult in the room," enabling federal leaders to leverage agentic technology in a secure and compliant manner. The good news: agents can connect just about everything. The bad news: agents can connect with everything. In the federal government, one needs trusted, mission-specific data through controlled interfaces. During the interview, Paul provides insight into balancing innovation and security while using Agentic AI in a federal environment. He envisions future AI evolving from reactive to proactive and personalized, potentially becoming a concierge for citizens. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Mar 30, 2026 • 29min
Federal HR Modernization: Mapping Chaos to Clarity
Today, we sat down with Charles Fiery from Excella to discuss the complexities of improving federal agency processes. He shared insights on the challenges of process discovery, change management, and data transformation. It is always difficult to assess a large enterprise, whether public or private, to determine how to improve complex processes. One approach is to look at duplicative systems; the federal government provides a notable example. The federal government has evolved into new agencies over the years. Because of technical and legal challenges, they have mostly remained siloed. As a result, we have human resource systems that do remarkably similar tasks. A consolidation effort would reduce costs, improve speed, and assist in interagency collaboration. The OMB mandate requires agencies to integrate core HR functions while maintaining ancillary services like payroll and benefits. The transition involves mapping current systems, identifying essential functions, and ensuring data compliance. Current systems need to ensure the data they provide is accurate and error-free. Each agency has unique data, and structuring that data is important. Visibility into system components is much more difficult. Connectors and integration are complicated by shadow IT and AI. Charles Fiery concludes that although the transition is challenging, completing the necessary groundwork will lead to stable and compliant improvements in federal HR systems. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Mar 26, 2026 • 22min
Fast Content Delivery and Security in Federal IT
In the 1990s, the World Wide Web was so popular that it was facetiously called the Worldwide Wait. Centralized servers handled a small workload but bogged down as volumes increased. As a result, Content Delivery Networks sprang up to distribute the workload worldwide. By 2001, large news organizations could manage unpredictable spikes in traffic. The past decade has seen a drastic increase in both traffic and the threats to it. During the interview, Omeed Nosarti describes how companies like Fastly began offering proprietary methods to deliver content faster. Nosarti highlights Fastly's proprietary technologies, such as Smart Parse, which reduces false positives in web application firewalls (WAFs), and its network architecture optimized for low latency and high cache hit ratios. The conversation also covers the many remote points appearing on federal networks. These remote points expand the attack surface and open the possibility of attacks on the Application Programming Interface (API). Nosarti also mentions Fastly's API security features, including schema enforcement and discovery, and its significant ROI in terms of infrastructure and human capital costs. Nosarti emphasizes the importance of real-time traffic analysis and the evolving nature of DDoS attacks. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Mar 24, 2026 • 20min
Real-Time AI Intelligence and the New Federal Cyber Threat Landscape
In 2026, we are seeing an increase in cyberattacks targeting defense contractors and defense production. Today, we met with Tim Miller, Field CTO at Dataminr, who explained how the company is helping the federal government address this growing threat. Traditionally, the most dangerous cyber threats were classified as "Zero Day": attacks targeting a software or hardware vulnerability unknown to the public. They were effective because no security patch existed, and they could bypass defenses. AI has compressed what was once a 24-hour response window to minutes. If your opponent is speeding up attacks, then the defender must use similar tools to prevent a breach. Dataminr has developed something called "real-time intelligence." This concept can provide early warnings, help separate nuisance attacks from serious malware, and address today's workforce gap in cyber defense knowledge. During the interview, Miller noted that the company also launched a new product for cyber defense that integrates threat intelligence with internal data. It is called Dataminr for Cyber Defense and leverages AI and Agentic AI to neutralize threats. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Mar 17, 2026 • 22min
How Ethical Hackers Help Federal Agencies Find Hidden Cyber Vulnerabilities
Today, we sat down with Trey Ford from Bugcrowd to talk about ethical hacking. One of the most memorable phrases from ancient Rome is Quis custodiet ipsos custodes? (Who watches the watchmen?). This ancient admonition has direct application to federal cybersecurity. We know federal agencies spend millions of dollars to protect data. How does one ensure the contracted companies are doing their jobs? Traditionally, an organization would use penetration testers, contractors, or basic scanning methods. However, today's attack surfaces are expanding, and malicious actors are innovating so rapidly that we are being forced to consider more creative options. In other words, an annual penetration test against an AI-inspired attack is too narrow to be effective. The innovation Bugcrowd brings to the table is a community of researchers who can attack a system from many perspectives. During the discussion, you will learn about federal vulnerability disclosure programs, how to overcome talent shortages, and how Bugcrowd vets its research community. Trey Ford also touches on the FedRAMP journey, AI integration, and the evolving cybersecurity landscape, stressing the need for human creativity and dynamic responses to threats. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Mar 12, 2026 • 24min
Mission-Ready Data: The Front Line of Federal Cybersecurity
The word "deplorable" signals something shockingly bad. Often used for the truly awful or dreadful, Todd Harbour, with decades of federal data experience, applies it specifically to data quality. That may be an overstatement, but the description certainly makes the point that today's AI is built on fragmented, incomplete data sets. The bright, shiny thing called AI is so much in focus that federal leaders may not pause to ask what data is being used to train today's models. During the interview, Harbour acknowledges that nobody is seeking perfection here. He has coined the term "mission-ready" to describe the kind of data that should be used for decision-making in the federal government. This would mean a serious attempt to include siloed and poorly structured data. In a fascinating digression, he refers to MIT's Project Iceberg. This initiative suggests that AI's visible economic impact is only the "tip of the iceberg"; the majority of that impact lies in the future, beneath the surface. If that is the case, the case for mission-ready data is even stronger. Harbour urges immediate, proactive measures to confront these challenges and prepare for rapid AI-driven changes to cybersecurity and national defense. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Feb 24, 2026 • 27min
Fed up with FedRAMP? How Knox Delivers Authorization in 90 Days
When people look back on 2025, they will see many changes in the FedRAMP process. It looks like a new administration examined the process, got feedback from companies, and launched new initiatives to speed it up. During today's interview, Irina Denisenko (Knox CEO) details FedRAMP's challenges and something called "FedRAMP 20x." Knox runs the largest FedRAMP-managed cloud, enabling 90-day authorizations by hosting customers' production environments. Denisenko explains the origin story of Knox Systems: she was running a training company and the Air Force wanted to use her product. It would have taken so long to complete the FedRAMP requirements that she simply bought a company that was already FedRAMP compliant. It is hard to believe the process is so frustrating that fewer than 500 applications are authorized at FedRAMP Moderate or High. The GSA initiative is called FedRAMP 20x. It shifts to continuous monitoring and continuous authorization, moving from annual audits (sampled every three years) and monthly CVE spreadsheets to real-time, machine-readable data. What Knox offers is a tried-and-true platform that has reduced compliance time in order to better serve federal needs. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Feb 17, 2026 • 23min
Fixing FedRAMP: How Automation Cuts ATO Time by 36 Weeks
Way back in 2011, one of the goals of FedRAMP was to eliminate software redundancy. The federal government had evolved to the point where one agency would spend millions of dollars on the same application that another agency in the same zip code had just invested in heavily. The theory proposed by luminaries like Vivek Kundra was to move to the cloud to share services, reducing cost and improving resilience. FedRAMP was the initiative that established a safe environment for federal cloud use: companies comply with regulations outlined in an Authorization to Operate (ATO). Fifteen years later, we are seeing the same duplication, not in the application programs, but in the process of getting the ATO itself. For example, FedRAMP, RMF, and agency internal policies may each require specific artifacts to satisfy one or the other. During the interview, Travis Howerton paints a picture of the legacy model: static documentation, annual and three-year audits, spreadsheets. His solution is to have AI assist with documentation, which can drastically reduce compliance time; he cites an example of reducing a process from 52 weeks to 16 weeks. RegScale uses OSCAL (XML/YAML/JSON) to auto-generate RMF artifacts and integrate with SIEMs (Splunk, Elastic), Axonius, ServiceNow, and APIs. Howerton understands the limitations of many automated systems and suggests that a human remains a key component: after the machine has assembled the data, a person makes the decision. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com

Feb 16, 2026 • 26min
Ep 302 API attacks, discovery, and resilience for federal agencies
Cybersecurity is a rapidly evolving field, where every effective defense technique is quickly noticed and adapted to by malicious actors. The real question is how fast each side of this ongoing cat-and-mouse game can respond. Let us take the example of web applications. In the decade-long slog to the cloud, federal users migrated to web-based applications protected by Web Application Firewalls (WAFs). As that method matured, malicious observers noted that the Application Programming Interface (API) allowed these software programs to communicate and exchange data. Voila, another attack vector was born. During today's interview, Joe Henry from Akamai Technologies notes that 80% of their customers report API attacks. Henry details a curious term called "Broken Object Level Authorization." In this attack, an application fails to check if a user is authorized to access specific data objects. The ID is manipulated, and the malicious actor gets access. Akamai's API Security performs behavioral analysis beyond WAFs, flags PII exposure, and supports a zero-trust posture. Software developers talk about a "shift left"; we can apply that to the Akamai approach. They have a worldwide network of Points of Presence (POPs) and data centers where they can observe attacks as they develop. The network is strong enough to provide fail-open resilience with a 100% SLA. Akamai publishes a quarterly State of the Internet Report. If you would like to stay connected with the next manifestation of attack, consider subscribing or visiting their website to stay informed about the latest trends. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com
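The Broken Object Level Authorization flaw Henry describes can be sketched in a few lines. This is a hypothetical illustration (invented invoice data and function names), not Akamai's code or the specific attack discussed in the episode:

```python
# Illustrative sketch of Broken Object Level Authorization (BOLA).
# Hypothetical data: invoice records owned by different users.
INVOICES = {
    "inv-100": {"owner": "alice", "amount": 250},
    "inv-101": {"owner": "bob", "amount": 990},
}

def get_invoice_vulnerable(user, invoice_id):
    """Vulnerable: trusts the client-supplied ID and never checks ownership."""
    return INVOICES.get(invoice_id)

def get_invoice_fixed(user, invoice_id):
    """Fixed: verifies the requesting user owns the object before returning it."""
    invoice = INVOICES.get(invoice_id)
    if invoice is None or invoice["owner"] != user:
        return None  # deny, rather than leak another user's record
    return invoice

# Alice manipulates the object ID in her request to fetch Bob's invoice.
leaked = get_invoice_vulnerable("alice", "inv-101")  # returns Bob's record
denied = get_invoice_fixed("alice", "inv-101")       # returns None
```

The fix is a per-object ownership check at the point of access; a WAF inspecting request syntax alone cannot catch this, which is why API-aware behavioral analysis matters.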

Feb 10, 2026 • 20min
Ep. 301 Edge Computing for Government: Rancher's Role in Secure Hybrid Federal Environments
Twenty years ago, the concept of Bring Your Own Device (BYOD) entered the federal IT landscape with the advent of network-connected devices like Blackberries—sometimes even within secure federal networks. This slow start has exploded into a federal information technology system with sensors on satellites, submarines, and everywhere in between. That "in between" can include on-prem networks, multiple clouds, and hybrid clouds. Today, we sit down with Ryan Lewis, the CEO of Rancher Government Solutions, to look at some of the challenges of managing this dispersed environment. Lewis describes how Rancher connects hybrid environments using containers and Kubernetes for secure orchestration. Lewis emphasizes continuous compliance and DevSecOps via Rancher's Carbide stack, SBOM-level visibility, and rapid recovery in contested, denied/disconnected/intermittent/limited (DDIL) environments. Lewis notes that Rancher's declarative stack reduces maintenance and allows simple app redeployment. He also emphasizes portability, cost efficiency, and alignment with zero-trust principles, with upcoming hardened features. Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/ Want to listen to other episodes? www.Federaltechpodcast.com


