JSON Security Best Practices: Prevent Injection, XSS, and Prototype Pollution
JSON powers nearly every modern web application, from REST APIs and configuration files to browser storage and inter-service communication. But its ubiquity also makes it a prime attack vector. Improperly handled JSON can lead to injection attacks, prototype pollution, cross-site scripting, denial of service, and sensitive data leaks. This guide walks through the most critical JSON security risks with real-world code examples and battle-tested mitigation strategies.
1. JSON Injection Attacks
JSON injection occurs when untrusted user input is concatenated directly into a JSON string instead of being properly serialized. An attacker can break out of the intended value, inject new keys, or alter the structure of the entire payload. This is conceptually similar to SQL injection but targets JSON-based data flows.
How JSON Injection Works
Consider a server that builds a JSON string by concatenating user input directly:
// Vulnerable JavaScript - NEVER do this
const username = req.body.username;
const jsonString = '{"user": "' + username + '", "role": "viewer"}';
// If username is: alice", "role": "admin", "x": "
// Result: {"user": "alice", "role": "admin", "x": "", "role": "viewer"}
// The attacker has injected an admin role!
The same vulnerability appears in Python when developers use string formatting instead of the json module:
# Vulnerable Python - NEVER do this
username = request.form["username"]
json_string = '{"user": "%s", "role": "viewer"}' % username
# Attacker sends: alice", "role": "admin", "x": "
# Result is the same privilege escalation
How to Prevent JSON Injection
The fix is straightforward: always use your language's built-in JSON serialization functions. They handle escaping special characters automatically.
// Safe JavaScript
const data = { user: username, role: "viewer" };
const jsonString = JSON.stringify(data);
// Special characters are escaped automatically
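To see the escaping in action, here is a small sketch (assuming Node.js) that feeds the attack string from the vulnerable example through JSON.stringify:

```javascript
// The injection payload from the vulnerable example
const username = 'alice", "role": "admin", "x": "';

// Serialize instead of concatenating
const jsonString = JSON.stringify({ user: username, role: 'viewer' });

// The embedded quotes are escaped (\"), so the attacker's text stays
// inside the user value and cannot add or override keys
const parsed = JSON.parse(jsonString);
console.log(parsed.role); // "viewer": the injected "admin" never became a key
```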
# Safe Python
import json
data = {"user": username, "role": "viewer"}
json_string = json.dumps(data)
# The json module escapes quotes and special characters
Always validate your JSON output with a JSON validator to confirm the structure matches your expectations, especially when building dynamic payloads.
2. Prototype Pollution via JSON.parse
Prototype pollution is a JavaScript-specific vulnerability in which an attacker smuggles __proto__ or constructor keys into an object in order to inject properties into the prototype chain. JSON.parse itself does not pollute anything: it creates __proto__ as an ordinary own property of the parsed object. The danger arises when a naive merge or clone copies that key onto another object, polluting Object.prototype so that every object created afterward inherits the attacker-controlled properties.
Example Attack
// Attacker sends this JSON payload
const malicious = '{"__proto__": {"isAdmin": true}}';
// Server merges it into an existing object
function merge(target, source) {
for (const key in source) {
if (typeof source[key] === 'object' && source[key] !== null) {
if (!target[key]) target[key] = {};
merge(target[key], source[key]);
} else {
target[key] = source[key];
}
}
return target;
}
const config = {};
merge(config, JSON.parse(malicious));
// Now EVERY object inherits isAdmin = true
const newUser = {};
console.log(newUser.isAdmin); // true — prototype polluted!
Mitigation Strategies
There are several effective defenses against prototype pollution:
// 1. Use Object.create(null) for dictionaries
const safeConfig = Object.create(null);
// This object has NO prototype — __proto__ is just a regular key
// 2. Strip dangerous keys before merging
function safeMerge(target, source) {
for (const key of Object.keys(source)) {
if (key === '__proto__' || key === 'constructor' || key === 'prototype') {
continue; // Skip dangerous keys
}
if (typeof source[key] === 'object' && source[key] !== null) {
if (!target[key]) target[key] = {};
safeMerge(target[key], source[key]);
} else {
target[key] = source[key];
}
}
return target;
}
// 3. Freeze the prototype (defense in depth)
Object.freeze(Object.prototype);
// 4. Use Map instead of plain objects for user data
const userData = new Map();
userData.set(key, value); // No prototype chain involved
Schema validation is another powerful defense. Use a JSON Schema to define exactly which properties are allowed, rejecting any payload that contains __proto__, constructor, or other unexpected keys.
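As a sanity check, the safeMerge pattern above can be exercised against the earlier attack payload (a minimal sketch, assuming Node.js; safeMerge is re-declared here so the snippet is self-contained):

```javascript
// Same logic as the safeMerge defense above
function safeMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (key === '__proto__' || key === 'constructor' || key === 'prototype') {
      continue; // skip dangerous keys
    }
    if (typeof source[key] === 'object' && source[key] !== null) {
      if (!target[key]) target[key] = {};
      safeMerge(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

const payload = '{"__proto__": {"isAdmin": true}, "theme": "dark"}';
const config = safeMerge({}, JSON.parse(payload));

console.log(config.theme); // "dark" (legitimate key survives)
console.log({}.isAdmin);   // undefined (prototype untouched)
```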
3. Server-Side Injection: NoSQL Injection with MongoDB
NoSQL databases like MongoDB accept JSON-like query objects. When user input is embedded into these queries without validation, attackers can inject query operators to bypass authentication, extract data, or modify records.
Vulnerable Query
// Vulnerable Node.js + MongoDB login
app.post('/login', async (req, res) => {
const { username, password } = req.body;
// Attacker sends: {"username": {"$ne": ""}, "password": {"$ne": ""}}
const user = await db.collection('users').findOne({
username: username,
password: password
});
// The $ne (not equal) operator matches ANY non-empty value
// This returns the first user in the collection — authentication bypassed!
if (user) {
res.json({ success: true, token: generateToken(user) });
}
});
Parameterized Fix
// Safe Node.js + MongoDB login
app.post('/login', async (req, res) => {
const { username, password } = req.body;
// Ensure inputs are strings, not objects
if (typeof username !== 'string' || typeof password !== 'string') {
return res.status(400).json({ error: 'Invalid input types' });
}
// Look up the user by username only; never query by the password field.
// bcrypt hashes are salted, so hashing the input again would produce a
// different hash and never match. Verify with bcrypt.compare instead.
const user = await db.collection('users').findOne({
username: username
});
if (user && await bcrypt.compare(password, user.password)) {
res.json({ success: true, token: generateToken(user) });
} else {
res.status(401).json({ error: 'Invalid credentials' });
}
});
In Python with PyMongo, apply the same principle:
# Safe Python + PyMongo
from flask import request, jsonify
@app.route('/login', methods=['POST'])
def login():
data = request.get_json()
username = data.get('username')
password = data.get('password')
# Validate types explicitly
if not isinstance(username, str) or not isinstance(password, str):
return jsonify({"error": "Invalid input"}), 400
user = db.users.find_one({
"username": str(username), # Force to string
})
if user and check_password_hash(user['password'], password):
return jsonify({"success": True})
return jsonify({"error": "Invalid credentials"}), 401
4. Cross-Site Scripting (XSS) Through JSON Data
XSS via JSON happens when JSON values containing HTML or JavaScript are rendered directly into a web page without proper escaping. This is especially dangerous in single-page applications that fetch JSON from APIs and inject the data into the DOM.
The Attack Scenario
// API returns user-generated content
{
"username": "alice",
"bio": "<img src=x onerror=alert(document.cookie)>"
}
// Vulnerable frontend code
document.getElementById('bio').innerHTML = userData.bio;
// The onerror handler executes — cookies are stolen!
Prevention Techniques
// 1. Use textContent instead of innerHTML
document.getElementById('bio').textContent = userData.bio;
// HTML tags are displayed as text, not executed
// 2. In React, JSX escapes by default (safe!)
function UserBio({ bio }) {
return <p>{bio}</p>; // React auto-escapes this
}
// Only dangerouslySetInnerHTML bypasses this protection
// 3. Sanitize HTML if you MUST render it
import DOMPurify from 'dompurify';
const cleanBio = DOMPurify.sanitize(userData.bio);
document.getElementById('bio').innerHTML = cleanBio;
// 4. Server-side: escape JSON embedded in HTML
// NEVER do this:
// <script>var data = ${JSON.stringify(userData)};</script>
// Instead, escape the dangerous characters as \uXXXX sequences:
function escapeJsonForHtml(jsonString) {
return jsonString
.replace(/</g, '\\u003c') // doubled backslash: emits the literal text \u003c
.replace(/>/g, '\\u003e')
.replace(/&/g, '\\u0026');
}
// The \uXXXX sequences are valid JSON string escapes, so the data
// survives JSON.parse unchanged while the markup stays inert
Always validate your JSON data structures using a JSON validator before rendering, and apply context-appropriate escaping at every output boundary.
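To verify the escaping helper behaves as intended, a quick sketch (assuming Node.js; note the doubled backslashes, which emit the literal six-character sequence \u003c rather than a raw < character):

```javascript
function escapeJsonForHtml(jsonString) {
  return jsonString
    .replace(/</g, '\\u003c')
    .replace(/>/g, '\\u003e')
    .replace(/&/g, '\\u0026');
}

// A bio that tries to break out of an inline <script> block
const userData = { bio: '</script><script>alert(1)</script>' };
const safe = escapeJsonForHtml(JSON.stringify(userData));

console.log(safe.includes('</script>')); // false: no raw angle brackets remain
// \u003c and friends are valid JSON string escapes, so the data round-trips intact
console.log(JSON.parse(safe).bio === userData.bio); // true
```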
5. Denial of Service (DoS) via Malicious JSON
Attackers can craft JSON payloads designed to exhaust server resources during parsing. There are three primary vectors: deeply nested structures, extremely large payloads, and JSON bombs (exponential expansion).
Deeply Nested JSON
// Deeply nested JSON can cause stack overflow during parsing
// 100,000 levels of nesting:
const malicious = '['.repeat(100000) + ']'.repeat(100000);
JSON.parse(malicious); // Stack overflow or extreme CPU usage
JSON Bomb (Billion Laughs Equivalent)
// A relatively small payload that expands when processed
// Imagine a 1KB JSON that references itself or triggers
// exponential object creation in application logic
{
"a": "AAAA....(repeated 10000 times)",
"b": ["copy of a", "copy of a", "copy of a", ...]
}
// After processing/expanding: hundreds of MB in memory
Mitigation: Size and Depth Limits
// Node.js/Express: Limit request body size
const express = require('express');
const app = express();
app.use(express.json({
limit: '1mb' // Reject payloads larger than 1MB
}));
// Custom depth checker
function checkJsonDepth(obj, maxDepth = 20, currentDepth = 0) {
if (currentDepth > maxDepth) {
throw new Error('JSON nesting depth exceeds maximum allowed');
}
if (typeof obj === 'object' && obj !== null) {
for (const value of Object.values(obj)) {
checkJsonDepth(value, maxDepth, currentDepth + 1);
}
}
return true;
}
// Apply in middleware
app.use((req, res, next) => {
try {
if (req.body) checkJsonDepth(req.body);
next();
} catch (err) {
res.status(400).json({ error: 'Payload too complex' });
}
});
# Python/Flask: Size and depth limits
from flask import Flask, request, jsonify
import json
import sys
app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 1 * 1024 * 1024 # 1MB limit
def check_depth(obj, max_depth=20, current=0):
if current > max_depth:
raise ValueError("JSON nesting too deep")
if isinstance(obj, dict):
for v in obj.values():
check_depth(v, max_depth, current + 1)
elif isinstance(obj, list):
for item in obj:
check_depth(item, max_depth, current + 1)
@app.before_request
def validate_json_depth():
if request.is_json:
try:
check_depth(request.get_json())
except ValueError:
return jsonify({"error": "Payload too deeply nested"}), 400
6. Sensitive Data Exposure in JSON Responses
One of the most common security mistakes is returning more data in JSON responses than the client needs. Passwords, tokens, internal IDs, and other sensitive fields frequently leak through over-exposed API endpoints.
The Problem
// Dangerous: returning the full user object
app.get('/api/user/:id', async (req, res) => {
const user = await db.collection('users').findOne({ id: req.params.id });
res.json(user);
// Response includes everything:
// {
// "id": 123,
// "name": "Alice",
// "email": "alice@example.com",
// "password": "$2b$10$hashedPasswordHere",
// "ssn": "123-45-6789",
// "internalNotes": "VIP customer, discount approved",
// "apiKey": "sk-live-abc123..."
// }
});
The Fix: Allowlisting Fields
// Safe: explicitly select which fields to return
app.get('/api/user/:id', async (req, res) => {
const user = await db.collection('users').findOne(
{ id: req.params.id },
{ projection: { name: 1, email: 1, avatar: 1 } } // Allowlist
);
res.json(user);
// Only returns: {"name": "Alice", "email": "alice@example.com", "avatar": "..."}
});
// Alternative: use a serialization layer
function serializeUser(user) {
return {
id: user.id,
name: user.name,
email: user.email,
avatar: user.avatar,
createdAt: user.createdAt
};
}
app.get('/api/user/:id', async (req, res) => {
const user = await db.collection('users').findOne({ id: req.params.id });
res.json(serializeUser(user)); // Only safe fields
});
# Python: Use Pydantic models for serialization
from pydantic import BaseModel
class UserResponse(BaseModel):
id: int
name: str
email: str
avatar: str | None = None
class Config:
# Silently drop any fields from the raw document that are not declared above.
# Do NOT use extra = "forbid" here: a raw Mongo document contains fields like
# password, and forbid would raise a ValidationError instead of filtering them.
extra = "ignore"
@app.route('/api/user/<int:user_id>')
def get_user(user_id):
user = db.users.find_one({"id": user_id})
safe_user = UserResponse(**user)
return jsonify(safe_user.dict())
Key rules for preventing data exposure:
- Never return raw database objects directly as JSON
- Use an allowlist approach: explicitly define which fields to include
- Never include passwords, tokens, API keys, or secrets in responses
- Audit your API responses regularly with tools like your browser DevTools or a JSON formatter to inspect exactly what is being sent
- Use different serializers for different contexts (admin vs. public API)
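These rules are easy to enforce mechanically. A minimal sketch (assuming Node.js; the raw record below is hypothetical) showing that an allowlist serializer strips everything it does not explicitly name:

```javascript
// Allowlist serializer: every returned field is named explicitly
function serializeUser(user) {
  return {
    id: user.id,
    name: user.name,
    email: user.email
  };
}

// Hypothetical raw database record containing sensitive fields
const rawUser = {
  id: 123,
  name: 'Alice',
  email: 'alice@example.com',
  password: '$2b$10$hashedPasswordHere',
  apiKey: 'sk-live-abc123'
};

const response = serializeUser(rawUser);
console.log('password' in response); // false
console.log('apiKey' in response);   // false
```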
7. Safe JSON Parsing Patterns
Even with all the protections above, your application must handle malformed or malicious JSON gracefully. A missing try-catch around JSON.parse can crash your entire server.
Always Wrap JSON.parse in try-catch
// Safe parsing in JavaScript
function safeJsonParse(input) {
try {
const parsed = JSON.parse(input);
return { success: true, data: parsed };
} catch (error) {
return { success: false, error: error.message };
}
}
// Usage
const result = safeJsonParse(userInput);
if (!result.success) {
console.error('Invalid JSON:', result.error);
return res.status(400).json({ error: 'Invalid JSON format' });
}
// Proceed with result.data
Schema Validation After Parsing
// Validate structure after parsing using Ajv (JSON Schema validator)
const Ajv = require('ajv');
const addFormats = require('ajv-formats'); // required for format: 'email' in Ajv v8+
const ajv = new Ajv();
addFormats(ajv);
const userSchema = {
type: 'object',
properties: {
username: { type: 'string', minLength: 1, maxLength: 50 },
email: { type: 'string', format: 'email' },
age: { type: 'integer', minimum: 0, maximum: 150 }
},
required: ['username', 'email'],
additionalProperties: false // Reject unexpected fields
};
const validate = ajv.compile(userSchema);
app.post('/api/users', (req, res) => {
if (!validate(req.body)) {
return res.status(400).json({
error: 'Validation failed',
details: validate.errors
});
}
// Input is now guaranteed to match the schema
createUser(req.body);
});
Generate your validation schemas automatically using our JSON Schema Generator and then tighten the constraints for production use.
Complete Safe Parsing Pipeline
// Production-grade JSON parsing middleware for Node.js
function secureJsonMiddleware(options = {}) {
const {
maxSize = 1024 * 1024, // 1MB default
maxDepth = 20, // Maximum nesting depth
forbiddenKeys = ['__proto__', 'constructor', 'prototype']
} = options;
return (req, res, next) => {
// 1. Size check (handled by express.json limit)
// 2. Parse check (handled by express.json)
if (!req.body) return next();
// 3. Depth check
try {
checkDepth(req.body, maxDepth);
} catch (err) {
return res.status(400).json({ error: 'Payload nesting too deep' });
}
// 4. Forbidden key check (prototype pollution prevention)
try {
checkForbiddenKeys(req.body, forbiddenKeys);
} catch (err) {
return res.status(400).json({ error: 'Payload contains forbidden keys' });
}
next();
};
}
function checkDepth(obj, maxDepth, depth = 0) {
if (depth > maxDepth) throw new Error('Too deep');
if (typeof obj === 'object' && obj !== null) {
for (const val of Object.values(obj)) {
checkDepth(val, maxDepth, depth + 1);
}
}
}
function checkForbiddenKeys(obj, forbidden, visited = new Set()) {
if (typeof obj !== 'object' || obj === null || visited.has(obj)) return;
visited.add(obj);
for (const key of Object.keys(obj)) {
if (forbidden.includes(key)) {
throw new Error('Forbidden key: ' + key);
}
checkForbiddenKeys(obj[key], forbidden, visited);
}
}
// Apply to your Express app
app.use(express.json({ limit: '1mb' }));
app.use(secureJsonMiddleware({ maxDepth: 15 }));
8. JSON Security Checklist
Use this checklist to audit your application's JSON handling. Every item addresses a real attack vector covered in this guide.
| Category | Check | Risk if Missing |
|---|---|---|
| Serialization | Use JSON.stringify / json.dumps instead of string concatenation | JSON injection |
| Parsing | Wrap JSON.parse in try-catch | Server crash |
| Prototype | Strip __proto__, constructor, prototype keys | Prototype pollution |
| NoSQL | Validate input types before database queries | Authentication bypass |
| XSS | Use textContent or framework auto-escaping for output | Script injection |
| DoS | Enforce payload size limits (e.g., 1MB) | Resource exhaustion |
| DoS | Enforce maximum nesting depth (e.g., 20 levels) | Stack overflow |
| Data Exposure | Allowlist response fields, never return raw DB objects | Sensitive data leak |
| Validation | Validate all JSON input against a schema | Unexpected data |
| Transport | Always use HTTPS for JSON API communication | Data interception |
Putting It All Together
JSON security is not a single technique but a layered defense. Every stage of your JSON pipeline, from receiving input to sending responses, needs appropriate safeguards:
- Input: Limit size, validate depth, check for forbidden keys
- Parsing: Use try-catch, validate against a JSON Schema
- Processing: Never trust JSON values in database queries or HTML rendering
- Output: Allowlist fields, never expose internal data, escape for context
- Transport: Always use HTTPS, set appropriate CORS headers
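The layered pipeline can be condensed into a single illustrative handler (a framework-free sketch; the function and limits are hypothetical, not part of any library):

```javascript
// Each numbered stage mirrors one layer of the defense above
function handleRequest(rawBody) {
  // 1. Input: enforce a size limit before parsing
  if (rawBody.length > 1024 * 1024) return { status: 413 };

  // 2. Parsing: never let malformed JSON throw uncaught
  let data;
  try {
    data = JSON.parse(rawBody);
  } catch (err) {
    return { status: 400 };
  }

  // 3. Processing: validate types before trusting values
  if (typeof data.username !== 'string') return { status: 400 };

  // 4. Output: allowlist the fields that leave the server
  return { status: 200, body: JSON.stringify({ username: data.username }) };
}
```

An operator-injection payload such as {"username": {"$ne": ""}} fails the type check and is rejected with status 400, while extra fields in a valid request never reach the response.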
The tools on JSONUtil.com can help you build these defenses. Use the JSON Validator to verify payloads are well-formed, the JSON Schema Generator to create validation schemas for your APIs, and the JSON Formatter to inspect API responses for accidentally exposed fields. Security is an ongoing practice, not a one-time fix. Audit your JSON handling regularly, keep your dependencies updated, and treat every piece of incoming JSON as potentially hostile.