Note: This summary is AI-generated and may contain inaccuracies, errors, or omissions. If you spot any issues, please contact the site owner for corrections. Errors or omissions are unintended.
This is the first episode of Null Charcha, a panel discussion featuring Anant Shrivastava and Prashant discussing “Security Then and Now” - comparing security practices from 10+ years ago to the present day.
Panelists
Anant Shrivastava: Close to 15 years experience, trainer, pen tester, manager/director, maintains open source projects (Android-focused, code review), maintains hackingarchivesofindia.com portal tracking Indians in hacking space
Prashant: Known as “corrupt” in community, ran NULL jobs portal for 10 years, started from defensive side, moved to offensive security through NULL, pen tester, trainer, currently director
Personal interests: Anant - data hoarder (20TB), anime binge watcher (One Piece episode 352); Prashant - likes hikes and bushwalks
Key Topics Discussed
Security Then vs Now:
2000-2010 Era:
Hacking was fun: People were doing it because it was exciting, wanted to experiment
Finding vulnerabilities: Way easier, simpler - organizations struggling to identify what security professionals were doing
Straightforward attacks: What used to be a straightforward attack in 2010 might now be thwarted first by the browser, then the application framework, then the application code
2010-2022 Era:
Established job opportunity: Has become commercial enterprise, money-making opportunity
People here for money: Not because they wanted to be here - changes perspective
Nine-to-five: For many, infosec is effectively nine-to-five earning opportunity
Person excited vs day job: Someone excited about the field might spend two days exploring the landscape, while someone treating it as a day job just finishes the task and moves on
Default protections: Things more complicated, in-depth - default protections have come into place
Entry Paths:
Earlier: Pathway from sysadmin or other role, eventually moved into security
Now: Can start directly in security from college
Problem: People starting directly at security level - gaps in knowledge/skill set because never learned from ground up
Reality: Still tough for freshers to get into security even today, most still follow path where sysadmins pivot to security
Certifications: Comparatively easier now - a few certifications from some companies might get you into a job
Tools and Automation:
Scanners:
Always meant to be aids: Not replacement for manual pen testing
Evolved: More useful now - earlier scanners were one person trying to automate their own checks, with fewer plugins/checks
Still identify low-hanging fruits: But no replacement for manual testing for business logic flaws
Fringes then and now: Some people are happy taking scanner output and saying "VAPT done" - only the VA part is done, the compliance checkbox is ticked
Organizations looking to mitigate: Still rely on manual approach
Manual Testing:
More prominent now: But was prominent before also
Earlier: People doing manual, then writing automations around it
Now: More automations in place, people have more time to focus on stuff needing manual attention
Business logic: Manual will always be there because business logic issues
Till security is art: Till information security is considered art, human required
If/when move to science: Humans of higher caliber required, most could be automated
Bad Actors:
Evolution:
More sophisticated attacks: Bad actors have gone from then till now
Conti malware playbook: Accidentally released online - much more structured playbook compared to large number of pen testing organizations
Incentive structure: Bad actors have far bigger incentive to do things right - going to earn big chunk of money, punishment if anything goes wrong
Much more organized: Much more structured compared to person doing VA/PT
Time interval: VA/PT runs for a fixed time interval - testers can throw the ball to the other side, saying the engagement was "time constrained"
Bad actors: Have embraced every single thing you can think of
Defense Evolving:
Defense evolving because offense evolved: Offense is automated, and offense now gets the deed done in a much shorter span of time
Defense has to evolve: To counter those offenses
Both sides complementing: Each other
0-days dropped: Within a few minutes you see attacks going on, while defense hasn't even received the update saying "a 0-day dropped"
Cat and mouse game: One team lagging, other one pulls along
Logging and Monitoring:
Logging:
Problem solved: Now have access to so many resources - collecting logs, performing analysis much easier
Still lacking: Context behind logs - have to correlate, piece together what happened based on limited information
Logging predominantly solved: Resources and tools are available (AlienVault, ELK stack) - the problem is where to store so many logs
Logs too much: An organization generating 1 TB of logs per day with 180-day retention needs 180 TB of storage
Problem: Not that the logs aren't there, not that the logs can't be connected - it's where to store them; processing comes way after
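The retention arithmetic above is simple but worth making explicit; a quick back-of-the-envelope sketch using the figures from the discussion:

```python
# Back-of-the-envelope log storage: daily volume times retention window.
tb_per_day = 1.0        # log volume cited in the discussion
retention_days = 180    # retention window cited in the discussion

total_tb = tb_per_day * retention_days
print(f"{total_tb:.0f} TB to keep online")  # 180 TB to keep online
```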
Monitoring:
Context piece still problem: Uniformity of logs, sync between logs
Government directive: Asking organizations to sync time clocks - organizations may have different clocks set, with servers not syncing to NTP servers
5am attack might be 5pm: A "5am" attack in one log might actually be 5pm in Indian Standard Time
PII in logs: Large amount of data being stored - becoming very common to have PII or stuff that should not be stored in logs getting stored
Application crash dump: Contains xyz details - those sort of problems people trying to solve
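Both monitoring problems above - clock/timezone skew and PII leaking into logs - can be mitigated mechanically at ingestion time. A minimal sketch in Python; the timestamp, timezone name, and the "email only" PII pattern are illustrative assumptions:

```python
import re
from datetime import datetime
from zoneinfo import ZoneInfo

def to_utc(ts: str, source_tz: str) -> datetime:
    """Attach the server's timezone, then convert to UTC for correlation."""
    local = datetime.fromisoformat(ts).replace(tzinfo=ZoneInfo(source_tz))
    return local.astimezone(ZoneInfo("UTC"))

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scrub(line: str) -> str:
    """Mask obvious PII (here just email addresses) before logs are shipped."""
    return EMAIL.sub("<redacted>", line)

# A "5am" entry from a server whose clock runs on IST is actually
# 23:30 UTC the previous day -- correlate everything in UTC.
print(to_utc("2022-06-15T05:00:00", "Asia/Kolkata"))
print(scrub("login failure for user@example.com from 10.0.0.5"))
```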
Mindset Change:
Attack happening/will happen: Mindset converted that attack is happening, attack will happen, you will get hacked, or you are hacked already and don’t know
Security before attack: Less interesting - knowing when attack happened, what impact more requirement
Have to assume: Attacks might happen - in that context logging/monitoring plays important
Defensive Evolution:
Frameworks:
Security by default: Standard functions marked with a deprecation warning, or a clear warning that you should use the alternative function that applies filters by default
Default functions have filters: The insecure function remains available if you want to take the risk of going the insecure way
By default, frameworks provide security
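As a toy illustration of "secure by default": compare raw interpolation of user input with escaped output. Modern template engines do the escaped version automatically and make the raw path an explicit opt-in; this sketch uses only the Python stdlib:

```python
import html

user_input = '<script>alert(1)</script>'

# The insecure, opt-in path: raw interpolation of user input into markup.
raw = f"<p>Hello {user_input}</p>"

# The secure default: frameworks escape output like this unless told not to.
safe = f"<p>Hello {html.escape(user_input)}</p>"

print(safe)  # <p>Hello &lt;script&gt;alert(1)&lt;/script&gt;</p>
```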
OS Level:
Android/mobile devices: Don't give even the device owner root-level access by default - even though you own it, you are not the master of the device
Limits attack surface: Way lower
Desktop protections: Memory-safe execution, use-after-free mitigations, stack canaries (security cookies), and similar protections being added
Windows, Linux, Mac: Everywhere aim is to thwart attacker from getting easy execution
Automation:
Defensive side automation: Blue team utilizing defense automation to defend systems
Definitely helped
Broken Access Control:
Why Always There:
OWASP Top 10: Broken access control always there since 2003
Not just web: Infrastructure, AD side, other technologies - always major issue
Balance: Security always balance between security itself and convenience of user
Make system difficult: User will find bypasses around security controls
Balance not found: Properly - leads to different problems like bypassing authentication
Amalgamation:
Multiple issues: Broken access control is an amalgamation of multiple issues (much as "injection" covers SQL, XSS, XML, LDAP)
Identity and access management: Critical piece - no one wants to take ownership, that’s where everything goes wrong
Azure AD environment: One user permission incorrectly given can lead to catastrophe
Cloud environment: Shadow admins - not admin on face but can become admin or make someone else admin because of specific permission
Easy to screw up: Very difficult to identify where you went wrong
Automation gap: Still big gap - not easy to automatically detect something went wrong
Right call at one point: Later in lifecycle something else changed, decision now causing problem - exploitable situation
Frameworks trying to solve: But still large gap needs to be filled
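The "shadow admin" problem above is detectable in principle: scan each principal's policy for actions that allow privilege escalation even when the principal is not nominally an admin. A minimal sketch against an AWS-IAM-style policy document - the escalation action list here is illustrative, not exhaustive:

```python
# Actions that can lead to privilege escalation (illustrative subset).
ESCALATION_ACTIONS = {
    "iam:PutUserPolicy", "iam:AttachUserPolicy",
    "iam:CreateAccessKey", "iam:UpdateAssumeRolePolicy",
}

def shadow_admin_actions(policy: dict) -> set:
    """Return escalation-capable actions granted by an IAM-style policy."""
    found = set()
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        if isinstance(actions, str):  # a single action may appear as a string
            actions = [actions]
        found |= ESCALATION_ACTIONS & set(actions)
    return found

# Looks like a harmless read-only user, but can mint new access keys.
policy = {"Statement": [{"Effect": "Allow",
                         "Action": ["s3:GetObject", "iam:CreateAccessKey"]}]}
print(shadow_admin_actions(policy))  # {'iam:CreateAccessKey'}
```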
Funny Vulnerability Stories:
Prashant’s Story:
Client very hip: In kickoff meeting saying “got all security controls in place”
Just tried payloads: In login page - there was XSS injection there
That still happens: Sometimes
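"Just trying payloads on the login page" boils down to checking whether input comes back unescaped in the response. A toy reflection check; the response body below is a hypothetical echo of the username field, not from the actual engagement:

```python
def reflected_unescaped(payload: str, response_body: str) -> bool:
    """True if the payload is echoed back verbatim (i.e. not escaped)."""
    return payload in response_body

payload = '"><script>alert(1)</script>'
# Hypothetical response from a login page that echoes the username back:
body = '<input name="user" value=""><script>alert(1)</script>">'
print(reflected_unescaped(payload, body))  # True
```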
Anant’s Stories:
Vendor web applications: 2016-17 time frame - if certain vendor created web application, would try blind SQL injection on login page, have access to entire database
Big application space: Client had applications for entire European region and Asian region - every country had own dedicated websites
Testing one website: Given blank slate, started with one website
Forgot to block .svn: The deployment copied over the whole working copy, including the .svn directory, and nobody blocked access to it
Got usernames: From .SVN folder config file - contains usernames used to commit, URL of SVN repository
SVN repository publicly accessible: Username was the password
Got access: To entire SVN repository - about 50 different repositories for each country
Dropped PHP file: Shell was live in about 5 minutes - they had full CI/CD automation, so a file added to source control was automatically deployed
Within hour: Had shells on each and every website - different servers, different AWS accounts, different setups
How deep to go: Do we want to stop at one point or keep going in
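The .svn leak works because pre-1.7 Subversion kept a plain-text `entries` file in every directory's `.svn` folder, listing committer usernames and the repository URL. A sketch of pulling the repo URL out of such a file; the sample content and URL are hypothetical:

```python
# Pre-1.7 SVN stored per-directory metadata in .svn/entries, including
# committer usernames and the repository URL. Sample content is hypothetical.
def repo_url_from_entries(entries_text: str):
    """Return the first line that looks like a repository URL, if any."""
    for line in entries_text.splitlines():
        if line.startswith(("http://", "https://", "svn://")):
            return line
    return None

sample = "10\n\ndir\n42\nhttps://svn.example.com/repo/trunk\nbob\n"
print(repo_url_from_entries(sample))  # https://svn.example.com/repo/trunk
```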
Other Stories:
AD restrictions, AV restrictions: Try figure out bypass around them, running around trying figure out what works
Pivoting across network: See what clicks where
Nessus scan: Started, took 15 minutes, looking at nmap output - saw 8080 open, Tomcat there
45 minutes done: Inside Tomcat because of default credentials, got shell, credentials dumped, moved to other machines, common passwords
Within one hour: Whole pen test done - were AD enterprise admin
Fun to see: But from defense point of view - why mistakes happening
Silly scenario: Exploited commonly known vulnerability, asked “why still unpatched?”
Response: “Give me two minutes, I’ll show you I patched it”
Problem: Change control not allowing restart - patch deployed but couldn’t restart, patch not in effect
Ground reality: The company can't change anything in the first month of a quarter (critical), the last month of a quarter (critical), the first week of a month (critical), or the last week of a month (critical)
Effectively: That leaves one free month per quarter, minus its first and last weeks - roughly two weeks in each of four months, eight weeks a year in which any change can be made
Defense lives with: Need to serve business, business needs to function - balance has to be maintained
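The Tomcat break-in above rests on nothing more than default credentials. Checking for them means iterating a short list and building the HTTP Basic auth header for each pair; a sketch with an illustrative credential list (actually sending the requests to `/manager/html` is deliberately left out):

```python
import base64

# A few classic Tomcat manager defaults (illustrative, not exhaustive).
DEFAULT_CREDS = [("tomcat", "tomcat"), ("admin", "admin"), ("tomcat", "s3cret")]

def basic_auth_header(user: str, password: str) -> str:
    """Build the Authorization header value for HTTP Basic auth."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

for user, password in DEFAULT_CREDS:
    # In a real check you would send this header to the manager URL
    # and look for a 200 instead of a 401.
    print(user, basic_auth_header(user, password))
```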
Open Source:
Evolution:
Earlier: Open source code was big no-no
Changed: Today proprietary software has open source code in it
Why commercial software is still required: Open source is somebody doing work and making it available to the world - proprietary software is specifically designed to solve the exact problem a company requires, with additional support
Additional support: Rarely there, or generally not easily available, for an open source component
Open Source Term:
Earlier: “I am writing something, want to show world - if you like it contribute, if don’t like it fine, if feel going wrong direction submit patch or fork it”
Licensing: Very restricted, very few licenses, very clear about freedom
Basic tenets: Freedom of access to code, freedom of tinkering, freedom of manipulating
Now: Effectively way of building resumes - need 4-5 different open source projects for good job opportunity
Corporates: Show “we are not bad players, see we open source stuff”
Point: Code available, want to do something with it your call, we are going to do things our way
Other side: “We made something fantastic, here it is, contribute as much as you can so later can take contributions and make proprietary code”
Checks and balances: In place but pretty much that’s way corporates look at open source now
Open Source in Corporate:
Bigger team: More complicated to create even tiniest piece of software
Relying on: Code written by smaller teams - smallest entity is single person (open source code)
Reduce time to market: Relying on open source code as starting point
Cascading dependencies: Building something and needing just one capability (say authentication), you pull in an OAuth module - otherwise you would have to write everything from scratch
Write from scratch: Conventional knowledge says that when you write everything from scratch, you end up causing more problems
Open source components: By virtue of being open source more comfortable using other open source components
Not third party: Talking about fourth party, fifth party, up to eleventh, twentieth, nth party dependencies
Everyone using: Each other’s open source components
Very few paying attention: To what they’re using - where SBOM directive mandate came from
Third party we don't care: The attitude is "whatever version is there is there, we can't do anything about it, we just need to inform the third party"
Source code reviews: People would refrain from looking at third parties - that’s problem
Composition: Web software is about 80-20 (80% open source components you did not write, 20% what you write yourself)
Desktop software: Much more proprietary and hidden - players still use own frameworks, own stuff
Flutter, Electron: Things going in same direction - desktops relying on open source components
Same problem: 70-80% of code from open source world
From days: “I don’t want open source code in codebase because license” to “can we put barriers, use API, ensure GPL2 not GPL3, then okay using it”
People evolved: Found ways to bypass restrictions - everyone gung-ho about open source in corporate, don’t see it going anywhere, see it growing more
SBOM (Software Bill of Materials):
Not issue per se: More of “knowing your own” - need to know what is there in environment
Existed for long time: A lot of people said “this is silver bullet”, prominent people debunked - SBOM not going to solve all problems
SBOM is first step: Effectively knowing what you have - like inventory
Inventory never accurate: Just like SBOM might not be accurate most of time, but gives indicator around what is there
Journey starts: From SBOM - not the end, should not be solution to problem, just start point
Once know what there: Need to start looking at attack surface reduction, version upgrades, why are we using this
npm example: is-even module, is-odd module - one depends on other, thousands depending on this one piece
Till dependencies there: As long as people don't realize "to save two lines of code I have added a whole module, and its owner as a trusted entity", things are going to be tricky
SBOM just start: That’s where you start with journey
SBOM Effectiveness:
Effective if: Every entity in line is maintaining SBOM - everyone gets cascaded
If no obfuscation: Easy to automatically identify second or third level dependencies
npm has done a good job: If an installed npm module has further dependencies down the line, the npm system keeps track of all of them - you get a list of all dependencies (not just direct but second- or third-level)
Holistic idea: What kind of packages using
Problem gets complicated: When have open source component depending on open source component but depending on component written in different language
npm component: Depending on something written in Go, which depends on something written in Python, which depends on something written in Perl
Multiverse of madness: Not just one way of tracking, different ways of identifying modules going to be there
That’s where problems: Can come up, people still facing, trying to solve
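npm's tracking of transitive dependencies lives in package-lock.json; flattening that tree is a crude first pass at an SBOM. A sketch over a lock-style nested dict - the is-even → is-odd → is-number chain is the real npm example mentioned above, but the version numbers here are illustrative:

```python
# Flatten a package-lock.json-style "dependencies" tree (v1 format) into
# the full transitive set -- a crude starting point for an SBOM.
def flatten(dep_tree: dict, out=None) -> dict:
    out = {} if out is None else out
    for name, meta in dep_tree.items():
        out[name] = meta.get("version", "?")
        flatten(meta.get("dependencies", {}), out)  # recurse into nesting
    return out

# Versions are illustrative; the is-even -> is-odd dependency is real.
lock = {
    "is-even": {
        "version": "1.0.0",
        "dependencies": {
            "is-odd": {
                "version": "0.1.2",
                "dependencies": {"is-number": {"version": "3.0.0"}},
            }
        },
    }
}
print(sorted(flatten(lock)))  # ['is-even', 'is-number', 'is-odd']
```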
Operating Systems:
OS Does Not Define Hacker:
Modern OSes interlinked: Does not matter which OS you use to hack
Windows: Has the Windows Subsystem for Linux (WSL)
Linux: Have all different tools and capabilities needed
Cross-platform tools: Majority of tools pen testers use are cross-platform (Windows and Linux)
Don’t get stuck: Should know basics of all of them
Corporate: Windows still most relevant OS from end user perspective, but when comes to servers Linux is uncrowned king
Depending on what attacking: Going to come across all of them - why focus on specific OS
Use what works best: In current scenario and get done with it
Scenarios:
Interesting tool: Allows hack into Azure systems - Windows only, works on PowerShell
PowerShell on Linux/Mac: Not able to handle it - need Windows machine
If no Windows knowledge: "I don't know how to start Windows or how to open a terminal" - that's bad as a pen tester
Dealing with lot of systems: As pen tester, auditor, forensic investigator - dealing with lot of systems, esoteric environments
Important: Know how to deal with these systems at basic level
Extend: Not just operating system - even to tool sets (vi vs emacs, sublime vs visual studio code)
Handle tool: That is in front of you at very basic capacity - should not become bottleneck
After pandemic: People got stuck - "I can't sit at this system, I need my mouse, my keyboard" - that should not be the case
By nature of job: More like tenants on someone else’s system than doing something on own systems - have to be flexible
Blurring Boundaries:
Windows machine: WSL2 gives a near replica of the Linux kernel environment
Mac OS: Because of how macOS and Unix/Linux are interrelated, it could always run more software from the Unix/Linux world
Apple M1/ARM: Moving into ARM territory, getting parallels into Android and iOS world
Linux: Have been ARM system, have been x86 system
Mac blurring lines: On ARM aspects
Windows blurring lines: On x86, Windows and Linux kind of systems
Does not matter: Which system in front of you - should be able to deal with it, move forward
Should not be bottleneck: Just facilitator to get to command prompt, get to actual application
IoT hacking: When talking about IoT environments, will be restricted with what can do within different environments
For most: Linux would be ideal environment - tap, dissect, do bunch of things
For other environments: Might simply be too big deal to get emulator running on Linux - might just be one click away on Windows
Don’t be stuck: On environment, rather focus on getting things done
Security Research:
Term:
Niche term: People used to define themselves
Everyone researching: In order to do jobs, in order to attack any application as pentester
Level of research: Different - people who work in Project Zero at Google, entire job is find zero days, disclose bugs
Some researchers: Focus on specific type of library, go into whole lot of depth
Term hasn't changed: That much - just that people now apply it to themselves instead of it being given to them
Research Difficulty:
Term research: Means actually researching - but a lot of people take it at face value
Big because: So much content available now compared to initial days
Initial days: Someone would write something based on own experience
Now: Lot of people writing about own experiences
People like me and Prashant: To be blamed - started putting this as barometer of good researcher, that you would have your own content
Kick-started: Everyone started putting out own content
Overabundance of content: Same thing explained in 20 different directions by 30 different people
Not difficult: To actually have content out in public
Difficult: To identify what is good, what is bad
Research Then vs Now:
Old days: “Let me search something, I am not going to find what looking for, will have to make do with whatever can find”
Now: “I found 200 articles, let me figure out what are they actually talking about, may end up getting that one thing after reading 200 articles because all nudging towards one point but not actually talking about that one point”
Research now more difficult: Because there is more noise content than the actual content you need
Not to mention BS content: People writing BS content just to gain popularity
Corporate world, SEO world: Realized this kind of content gives better market value
People writing content: Nowhere related to community, more related to environment - putting content out so SEO brings more traffic
Don’t care: What you do after read that content - just there to grab traffic, show ads, earn few cents
Quality Research:
Project Zero: People doing deep dive technical research
Spending hours, days, months: Trying figure out solution to problem - that’s quality research
People writing anything: Not saying don’t do that - saying do that as much as can, more good content going to be there
If more people genuinely interested: Write about it, more genuine content going to be there
Problem increased: For researchers - content online which was base of how should go about doing it has increased manifold
Advice for Quality Research:
One: Write - need to have habit of writing
Two: Explore what is available right now, find out what people are not focusing on, then deep dive on that
Anything people focusing more on: Area already being explored by n number of people - chances of getting something unique less
Looking at something ignored: People not looking at - high chances
Example: James Kettle (Albinowax) - every year comes out with interesting research, everyone in web application space gaga over it
All he is doing: Looking at HTTP headers - has not even reached HTTP body, still dealing with HTTP headers, just deep diving into it where no one else is looking at
Key: Look at something which no one else is looking at, go big
Be patient: Lot of times what going to find is something already someone else spotted - that’s fine
Bug bounty hunters: If found duplicates, you’re on right track - at least found something valuable to be marked as duplicate
Duplicate is fine: Finding something anyone already discovered is fine - keep going, keep looking at things that are areas people not exploring much
Future of Security:
Web3:
Term evolved: The "web3" term was known back in the 2014-15 time frame, but that was not the web3 we know of right now
Web3 at that time: About semantic web - web environment where machines can talk with machines, make sense of content
Semantic web: Got lost, web3 term got lost, stuck with web2 (consumer-producer setup)
2018: Web3 started getting back into traction with cryptocurrency and blockchain space
Within web3: Bunch of different interesting things happening
Most people here: Not about technology - if start talking about “how does IPFS work” - people will be like “yeah you deal with that, tell me which coin going to rise tomorrow”
Move towards tech aspects: Going to be interesting
Web2 Attack Patterns:
Web2 not going anywhere: Anytime soon
Broken access control: Always going to be there
Single page applications: More people moving towards - idea that will not leverage server side language, rather rely on API
Problems still going to be there: Just have to look at it from slightly different angle
AppSec vs NetSec:
NetSec slowly becoming absent: Modern environments being built over cloud
Infrastructure as Code: Config as Code setup - don’t actually deal with device themselves
Software stacks more open: Moving towards more HTTP, TCP kind of structures
If want to make money: AppSec is way to go
If already good with NetSec: Be that one person among millions who actually knows how to deal with specific device
Advice for New Web Application Security Testers:
Don’t aim for big players: At start
Finding duplicates is good: Means in right direction
Look at bounties with no money: Less competition
Not focus on one specific bug type: Rather explore as many bug types as possible for corporate career
If trying make money via bug bounty: Dig deep into one specific issue
Key Insights:
Security then (2000-2010): Hacking was fun, exciting, easier to find vulnerabilities
Security now (2010-2022): Established job opportunity, commercial enterprise, nine-to-five for many
Entry paths: Earlier from sysadmin/other role, now can start directly from college
Tools: Scanners evolved but still aids, manual testing always needed for business logic
Bad actors: Much more organized, structured, bigger incentives
Defense: Evolving because offense evolved, logging solved but monitoring/context still problem
Broken access control: Always there, balance between security and convenience, easy to screw up
Open source: From big no-no to 80% of code, cascading dependencies, SBOM just start
OS: Does not define hacker, modern OSes interlinked, don’t get stuck on one
Research: More difficult now due to overabundance of content, look at what people not focusing on
Future: Web3 interesting space, Web2 not going anywhere, NetSec slowly becoming absent, AppSec way to go
Actionable Takeaways:
Security then was fun/exciting, now is commercial enterprise
Entry paths changed - can start directly but still tough for freshers
Scanners evolved but still aids - manual always needed for business logic
Bad actors much more organized - bigger incentives
Defense evolving because offense evolved
Logging solved but monitoring/context still problem
Broken access control always there - balance between security and convenience
Open source from big no-no to 80% of code
SBOM just start - not solution, just knowing what you have
OS does not define hacker - don’t get stuck on one
Research more difficult now - look at what people not focusing on
Web3 interesting space but rediscovering old bugs
NetSec slowly becoming absent - AppSec way to go
For bug bounty: Dig deep into one area if want money, explore multiple if want corporate job
Learn what an attack does and how it works - don't just fire payloads