Cybersecurity

Swag for Critical Bugs: Why It's Exploitation, Not Gratitude

Lisa Anderson

January 27, 2026

11 min read

Fortune 500 companies are patching critical vulnerabilities like SQLi and RCE while offering only t-shirts as rewards. This deep dive explores why this practice devalues security research, exploits ethical hackers, and what the industry needs to change.

You find a critical Remote Code Execution flaw in a major bank's web application. You report it responsibly through their official channel. They patch it quietly in their next release cycle. And your reward? A branded water bottle and a line on their "Hall of Fame" webpage. Sound familiar? If you've spent any time in the bug hunting community, it probably does.

This isn't a hypothetical gripe—it's the daily reality for thousands of security researchers in 2026. The original Reddit post that sparked this discussion hit a nerve because it articulated a simmering frustration: when billion-dollar corporations treat critical security work like a community service project, something is fundamentally broken. The poster asked if they were being entitled, or if the industry was being exploited. After talking with dozens of hunters and analyzing hundreds of programs, I'm here to tell you: it's exploitation, plain and simple.

This article isn't just venting. We're going to break down exactly why "swag-only" and "glory-only" programs for P1/P2 vulnerabilities are damaging the entire security ecosystem. We'll look at the economic realities, the psychological impact on researchers, and what happens when you devalue the very people trying to keep systems safe. Most importantly, we'll discuss what you—as a hunter, a program manager, or just someone who cares about security—can actually do about it.

The Broken Economics of "Gratitude" in Security

Let's start with the most obvious problem: the math doesn't work. A Fortune 500 company typically has a security budget running into the millions. They pay their internal security team six-figure salaries. They invest in expensive scanning tools, penetration testing contracts, and compliance audits. Yet when an external researcher saves them from a potentially catastrophic breach, the proposed compensation is merchandise worth maybe $30 at wholesale.

This creates a perverse incentive structure. Critical vulnerabilities have real market value—not just on the black market (which we absolutely don't endorse), but in terms of risk mitigation. A single SQL injection flaw in a customer database could lead to regulatory fines under laws like GDPR, loss of customer trust, and remediation costs far exceeding even a generous bug bounty. By offering swag, companies are essentially saying, "We acknowledge you found something valuable, but we don't believe your time and skill have monetary worth to us."

And here's the kicker—this isn't about researchers getting rich. Most serious hunters I know aren't looking for a windfall from a single bug. They're looking for fair compensation for specialized work. When you spend 20 hours reverse-engineering an API to find an authentication bypass, that's billable consultant time anywhere else in the tech industry. To be handed a t-shirt after that effort isn't just insulting—it signals that the company doesn't understand the basic economics of security labor.
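
To see just how lopsided that exchange is, here's a back-of-the-envelope sketch in Python. Every figure in it (hours spent, hourly rate, swag cost, breach cost, risk reduction) is an illustrative assumption, not data from any real program:

```python
# Back-of-the-envelope numbers for what a swag-only reward asks a
# researcher to absorb. All figures are illustrative assumptions,
# not data from any real program.

HOURS_SPENT = 20           # e.g., reverse-engineering an API for an auth bypass
CONSULTANT_RATE = 150      # USD/hour; a conservative appsec consulting rate
SWAG_COST = 30             # USD; rough wholesale cost of branded merchandise

# Assumed breach-side numbers, for the company's view of the same bug.
BREACH_COST = 4_500_000    # USD; assumed all-in cost of a serious breach
RISK_REDUCTION = 0.01      # assume the fix removes 1% of breach probability

labor_value = HOURS_SPENT * CONSULTANT_RATE    # billable value of the work
mitigated_risk = BREACH_COST * RISK_REDUCTION  # expected loss avoided

print(f"Billable value of the work: ${labor_value:,}")        # $3,000
print(f"Expected loss avoided:      ${mitigated_risk:,.0f}")  # $45,000
print(f"Company's 'reward' outlay:  ${SWAG_COST}")            # $30
```

Even with deliberately conservative assumptions, the gap between what the company received and what it paid out spans orders of magnitude.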

Devaluing Work and Driving Talent Away

What happens when you consistently undervalue skilled work? The skilled workers leave. This is Economics 101, and it's playing out dramatically in the bug bounty space. The original poster mentioned auto-skipping any program that doesn't pay cash, and they're not alone. Many top-tier researchers privately maintain "blacklists" of companies known for swag-only responses to critical findings.

The impact is twofold. First, companies with these policies only attract inexperienced hunters or those who genuinely don't need the money (a tiny minority). The sophisticated researchers who find the subtle, chainable, business-logic flaws? They're over at HackerOne or Bugcrowd programs that pay competitively. Second, it creates a two-tier system where the security of a company's products depends largely on whether they've decided to respect researchers financially.

I've seen this firsthand. A major retail company ran a "Hall of Fame" program for years, receiving mostly low-severity reports. Then they had a breach traced to a vulnerability class that experienced hunters would have spotted immediately. Suddenly, they launched a paid program. But by then, their reputation in the community was already tarnished. It took them years to rebuild researcher trust—and they had to pay premium rates to do it.

The "Hall of Fame" Fallacy: When Recognition Isn't Enough

"But we give them recognition!" This is the other common defense from program managers. They point to their shiny Hall of Fame page, maybe with researcher photos and write-ups. For some hobbyists or students building a portfolio, this might have value. But for professional researchers, it's often worse than nothing.

Think about it from a career perspective. In 2026, a line on a corporate website means very little compared to verifiable, paid bug bounty earnings on platforms that track your stats. A Hall of Fame entry doesn't pay rent. It doesn't help when you're negotiating a rate for a consulting gig. And in many cases, companies don't even link to the researcher's professional profiles—so the "exposure" is minimal.

Worse, some companies use these pages as a way to avoid proper disclosure. They'll list a researcher's name next to a vague description like "identified an input validation issue" instead of clearly documenting the CVE and its impact. This helps the company downplay the severity while giving the researcher almost no credible proof of their work. It's recognition theater—all show, no substance.

The Ethical Slippery Slope and Burnout

Here's the part that really worries me: unsustainable practices create ethical fatigue. When researchers consistently feel exploited, even the most ethical among them start asking dangerous questions. "Why should I report this responsibly if they're just going to send me a hat?" "Maybe I should just publish this publicly and let them deal with the fallout." These aren't greedy thoughts—they're human reactions to being chronically undervalued.

Burnout in the hunting community is real and exacerbated by these dynamics. Imagine spending your weekend digging through JavaScript files, finally crafting a working exploit for an RCE, and then having a company's legal department grill you about your methodology before offering $0 in compensation. The next time you find something, you're less likely to report it at all. Or you might decide to focus exclusively on the handful of companies that treat researchers with professional respect.

This creates a security gap. The companies that need the most help—often those with immature security programs—are the ones driving away the talent that could help them mature. It's a vicious cycle that leaves everyone less secure.

What Companies Get Wrong (And How to Fix It)

So why do otherwise smart companies make this mistake? From my conversations with security managers, it usually comes down to a few misconceptions:

First, they think bug bounties are "extra" rather than core to their security posture. They view external researchers as nice-to-have supplements to their internal team, not as critical components of their defense. Second, they fear opening the floodgates to low-quality reports if they offer money. And third, they genuinely don't understand the researcher perspective—they've never been on the other side of the submission form.

The fixes aren't complicated, but they require a mindset shift:

  • Budget transparently: If you have budget for a security team, carve out a parallel budget for external researchers. Even a small pool for critical findings is better than nothing.
  • Tier your program: Not every bug needs a cash reward. But P1/P2 vulnerabilities absolutely do. Create clear guidelines: "Critical RCEs: $5,000 minimum. Medium-severity CSRF: recognition and swag." (A minimal sketch of such a tiering policy follows this list.)
  • Respect time: Pay for unique, severe findings that required obvious skill to uncover. If someone saved you from a breach, compensate them like you would any other risk mitigation service.
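
To make the tiering concrete, here's a minimal Python sketch of such a policy as a severity-to-reward lookup. The severity labels follow the common P1-P4 convention; the dollar floors and vulnerability examples are hypothetical illustrations, not recommended market rates:

```python
# A minimal sketch of a tiered reward policy as a severity-to-reward
# lookup. Labels follow the common P1-P4 convention; every dollar
# figure is a hypothetical illustration, not a recommended rate.

REWARD_TIERS = {
    "P1": {"label": "Critical (RCE, SQLi, auth bypass)", "min_cash": 5000},
    "P2": {"label": "High (stored XSS, IDOR on sensitive data)", "min_cash": 1500},
    "P3": {"label": "Medium (CSRF, reflected XSS)", "min_cash": 0, "swag": True},
    "P4": {"label": "Low (verbose errors, minor misconfigs)", "min_cash": 0},
}

def reward_for(severity: str) -> str:
    """Return a human-readable reward for a triaged severity level."""
    tier = REWARD_TIERS.get(severity.upper())
    if tier is None:
        return "Unknown severity: escalate to triage"
    if tier["min_cash"] > 0:
        return f"${tier['min_cash']:,} minimum cash payout"
    return "Recognition and swag" if tier.get("swag") else "Hall of Fame entry"

print(reward_for("P1"))  # $5,000 minimum cash payout
print(reward_for("P3"))  # Recognition and swag
```

The exact numbers matter less than the structure: researchers can see, before they invest a weekend, exactly which class of finding earns cash.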

Some companies have gotten this right. Microsoft, Google, and Apple all run respected programs with clear payout schedules. They understand that fair compensation isn't charity—it's an investment in continuous security testing that would cost far more through traditional consulting channels.

Practical Guide: How Hunters Should Respond in 2026

If you're a researcher tired of the swag grind, here's my practical advice based on what's working for successful hunters today:

1. Screen programs aggressively. Before you invest time, check the program's published policy. No clear payout structure for criticals? Move on. Your time is your most valuable asset—don't donate it to companies that won't respect it.

2. Negotiate before submission. For private programs or direct reports, it's becoming common practice to ask about compensation guidelines before revealing details of critical findings. A simple "Can you share your bounty structure for RCE-level vulnerabilities?" sets expectations early.

3. Build your reputation where it matters. Focus on platforms that track your stats and verify your skills. A strong profile on HackerOne with verified earnings is worth far more than a dozen Hall of Fame entries. These platforms also provide mediation if disputes arise.

4. Consider public disclosure timelines. Most responsible disclosure policies include a timeline for public release if the vendor doesn't respond or remediate. Know your rights and the prevailing norms. Sometimes the threat of public disclosure (done ethically) is the only leverage researchers have. (A minimal deadline-tracking sketch follows this list.)

5. Specialize. The hunters getting consistent payouts aren't just running automated scanners. They're diving deep into specific technology stacks, proprietary protocols, or mobile app binaries. Deep expertise is harder to replace with swag.
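
Picking up point 4 above: here's a minimal sketch for tracking a disclosure deadline. The 90-day window mirrors a widely used norm (Google Project Zero popularized it), but the correct window is whatever the program's published policy says; the report date is a placeholder:

```python
from datetime import date, timedelta

# A minimal sketch for tracking a coordinated-disclosure deadline.
# The 90-day window mirrors a widely used industry norm; the actual
# window should come from the program's published policy. The report
# date below is a placeholder.

REPORT_DATE = date(2026, 1, 27)          # hypothetical submission date
DISCLOSURE_WINDOW = timedelta(days=90)   # adjust to the program's policy

deadline = REPORT_DATE + DISCLOSURE_WINDOW
print(f"Planned public disclosure: {deadline.isoformat()}")  # 2026-04-27
```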

Common Objections and Real Answers

Let's tackle the counterarguments head-on, because I hear them all the time:

"Researchers should be happy to help make the internet safer!" Sure, and doctors should be happy to heal people. But doctors still get paid, because their skills represent years of training and have measurable economic value. Security research is no different.

"We're a nonprofit/small startup!" This is a valid exception. Many researchers voluntarily help smaller organizations. The original post specifically called out Fortune 500 companies—those with clear security budgets. Context matters.

"Swag is better than nothing!" Actually, for critical vulnerabilities, swag-only can be worse than nothing. It sets a precedent that expert work has no monetary value, which distorts the entire market. Sometimes saying no to swag is the only way to push for change.

"But we get reports anyway!" Yes, from inexperienced hunters or those building portfolios. You're missing the sophisticated attacks that only experienced researchers will find. You're getting a false sense of security.

The Path Forward: Professionalism Over Philanthropy

Where does this leave us in 2026? At a crossroads. The bug bounty industry has matured dramatically in the past decade, but some companies are still stuck in 2015 thinking. The choice is between treating security research as a professional service or treating it as a hobbyist community.

The companies that will be most secure in the coming years aren't necessarily those with the biggest budgets—they're those with the most respectful relationships with the researcher community. They understand that fair compensation isn't an expense; it's an investment in a continuous, scalable security testing workforce that operates around the globe, across time zones, and brings diverse perspectives they could never hire internally.

So to the original poster's question: Are you being entitled? No. You're being professional. And to the companies offering swag for critical bugs: It's time to grow up. The internet's security depends on it.

What can you do today? If you're a hunter, vote with your time. If you're a security manager, advocate for proper bounty budgets. And if you're just someone who cares about which companies get hacked next, pay attention to how they treat the people trying to prevent it. Because in security, you often get what you pay for—and right now, some companies are paying in t-shirts while hoping for million-dollar protection.

Lisa Anderson

Tech analyst specializing in productivity software and automation.