Every IAM migration eventually hits the password problem. Users have passwords stored as cryptographic hashes in the old system. You need those users in the new system without forcing all of them to reset their passwords on Day 1. Depending on the source and target platforms, this ranges from straightforward to genuinely painful.
## The Core Problem
Password hashes are one-way functions by design. You can’t reverse a bcrypt hash back to the original password. This means you have three options when migrating between identity platforms:
- Import hashes directly — If the target platform can verify the same hash format
- Progressive rehashing — Authenticate against the old system, capture the password, re-hash in the new format
- Force password reset — Nuclear option, simple but terrible UX
Option 1 is cleanest but depends on hash format compatibility. Option 2 is most common in practice. Option 3 is the last resort.
## Hash Format Compatibility Matrix
Before planning your migration, check what your source and target systems support:
| Source → Target | Keycloak | Auth0 | Okta | Entra ID | Cognito |
|---|---|---|---|---|---|
| bcrypt | Custom SPI ⚠️ | Import ✅ | No import ❌ | No import ❌ | Custom Lambda ⚠️ |
| PBKDF2 | Import ✅ | Import ✅ | No import ❌ | No import ❌ | Custom Lambda ⚠️ |
| scrypt | Custom SPI ⚠️ | Import ✅ | No import ❌ | No import ❌ | Custom Lambda ⚠️ |
| Argon2 | Import ✅ (v24+) | Custom rule ⚠️ | No import ❌ | No import ❌ | Custom Lambda ⚠️ |
| SHA-256/512 | Import ✅ | Import ✅ | No import ❌ | No import ❌ | Custom Lambda ⚠️ |
| MD5 | Import ✅ | Import ✅ | No import ❌ | No import ❌ | Custom Lambda ⚠️ |
Notice the pattern: Okta and Entra ID don’t support hash import at all. If migrating to either platform, you’re forced into progressive rehashing or password reset.
## Strategy 1: Direct Hash Import
When the target platform supports your hash format, this is the fastest path.
### Keycloak Hash Import
Keycloak accepts password hashes via the Admin REST API:
```bash
# Create user with pre-hashed password (bcrypt example)
curl -X POST "https://keycloak.example.com/admin/realms/myrealm/users" \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "username": "jdoe",
    "email": "[email protected]",
    "enabled": true,
    "credentials": [{
      "type": "password",
      "credentialData": "{\"algorithm\":\"bcrypt\",\"hashIterations\":12}",
      "secretData": "{\"value\":\"$2a$12$LJ3m4yv5WGEHzNz...\"}"
    }]
  }'
```

Current Keycloak releases expect the `credentialData`/`secretData` pair shown here; the older flat `hashedSaltedValue`/`algorithm`/`hashIterations` fields are deprecated and removed in newer versions. Note also that verifying bcrypt hashes requires a bcrypt password-hash provider extension, since Keycloak ships only PBKDF2 and Argon2 providers out of the box.
For bulk import, use the realm import feature with a JSON file:
```json
{
  "users": [
    {
      "username": "jdoe",
      "email": "[email protected]",
      "credentials": [{
        "type": "password",
        "credentialData": "{\"algorithm\":\"bcrypt\",\"hashIterations\":12}",
        "secretData": "{\"value\":\"$2a$12$LJ3m4yv5WGEHzNz...\"}"
      }]
    }
  ]
}
```
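Generating that import file by hand doesn't scale, so a small script is typically used to transform exported rows into the realm-import shape. Below is a minimal sketch; `keycloak_user_entry` and the `rows` sample data are hypothetical names for illustration, and the `credentialData`/`secretData` values follow the format shown above.

```python
import json

def keycloak_user_entry(username: str, email: str,
                        bcrypt_hash: str, cost: int = 12) -> dict:
    """Build one user entry in Keycloak's realm-import credential format."""
    return {
        "username": username,
        "email": email,
        "enabled": True,
        "credentials": [{
            "type": "password",
            # credentialData and secretData are JSON-encoded *strings*,
            # not nested objects; this trips up many import scripts
            "credentialData": json.dumps(
                {"algorithm": "bcrypt", "hashIterations": cost}),
            "secretData": json.dumps({"value": bcrypt_hash}),
        }],
    }

# Assemble the import file from exported (username, email, hash) rows
rows = [("jdoe", "[email protected]", "$2a$12$LJ3m4yv5WGEHzNz...")]
realm_import = {"users": [keycloak_user_entry(u, e, h) for u, e, h in rows]}
print(json.dumps(realm_import, indent=2))
```

The double-encoding of `credentialData` and `secretData` is the part worth automating: writing them as plain nested JSON objects instead of JSON strings makes Keycloak silently ignore the credential.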
### Auth0 Hash Import
Auth0 supports bulk user import with password hashes via the Management API:
```bash
# Create import job
curl -X POST "https://YOUR_DOMAIN.auth0.com/api/v2/jobs/users-imports" \
  -H "Authorization: Bearer $MGMT_TOKEN" \
  -F users=@users.json \
  -F connection_id="con_abc123" \
  -F upsert=false
```

The `users.json` file looks like this:

```json
[
  {
    "email": "[email protected]",
    "custom_password_hash": {
      "algorithm": "bcrypt",
      "hash": {
        "value": "$2b$12$LJ3m4yv5WGEHzNz..."
      }
    }
  }
]
```
Auth0's bulk import supports bcrypt ($2a$, $2b$), PBKDF2, scrypt, SHA-256, SHA-512, and MD5, and custom hash functions can be handled via a custom database login script.
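When the source database mixes hash formats, the import script has to pick the right `algorithm` value per user. A sketch of prefix-based detection, assuming PHC-style prefixes in the source data (`auth0_password_entry` is a hypothetical helper; formats without a distinguishing prefix, such as raw SHA or MD5 digests, need out-of-band knowledge and are rejected here):

```python
def auth0_password_entry(email: str, stored_hash: str) -> dict:
    """Map a stored hash to an Auth0 bulk-import record by prefix (heuristic)."""
    if stored_hash.startswith(("$2a$", "$2b$")):
        algorithm = "bcrypt"
    elif stored_hash.startswith("$pbkdf2"):
        algorithm = "pbkdf2"
    else:
        # Bare digests (SHA-256, MD5, ...) carry no prefix; the algorithm
        # must come from the source system's documentation instead
        raise ValueError(f"cannot infer algorithm from prefix: {stored_hash[:8]!r}")
    return {
        "email": email,
        "custom_password_hash": {
            "algorithm": algorithm,
            "hash": {"value": stored_hash},
        },
    }

entry = auth0_password_entry("[email protected]", "$2b$12$LJ3m4yv5WGEHzNz...")
```

Failing loudly on unrecognized prefixes is deliberate: importing a hash under the wrong algorithm name locks that user out silently, which is far worse than an import error.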
## Strategy 2: Progressive Rehashing
When direct hash import isn’t possible (or when you want to upgrade the hash algorithm during migration), progressive rehashing is the answer.
### Architecture
```
User Login Request
        │
        ▼
┌──────────────────┐     ┌──────────────────┐
│   New IdP        │────→│  Old IdP / DB    │
│   (target)       │     │  (source)        │
│                  │     │                  │
│ 1. Check if user │     │ 3. Validate      │
│    has new hash  │     │    old hash      │
│ 2. If not, proxy │     │ 4. Return success│
│    auth to old   │     │    /failure      │
│ 5. On success,   │     └──────────────────┘
│    re-hash and   │
│    store new hash│
│ 6. Next login    │
│    uses new hash │
└──────────────────┘
```
### Implementation Examples
Keycloak custom authenticator for progressive rehashing:
```java
public class LegacyAuthenticator implements Authenticator {
    @Override
    public void authenticate(AuthenticationFlowContext context) {
        UserModel user = context.getUser();
        // Check if user already has a Keycloak-native password
        if (user.credentialManager().isConfiguredFor(PasswordCredentialModel.TYPE)) {
            context.success();
            return;
        }
        // User doesn't have a local password — authenticate against legacy system
        String password = extractPassword(context);
        if (validateAgainstLegacy(user.getUsername(), password)) {
            // Set the password in Keycloak (hashes with Keycloak's algorithm)
            user.credentialManager().updateCredential(
                UserCredentialModel.password(password)
            );
            context.success();
        } else {
            context.failure(AuthenticationFlowError.INVALID_CREDENTIALS);
        }
    }

    private boolean validateAgainstLegacy(String username, String password) {
        // Call legacy IdP or validate against imported hash
        // ...
        return false; // placeholder
    }
}
```
Auth0 custom database connection — Auth0 natively supports this pattern:
```javascript
// Auth0 Login script (custom database) — must be async to use await
async function login(email, password, callback) {
  // Try new database first
  const user = await findUserInNewDB(email);
  if (user && user.migrated) {
    return callback(null, user);
  }
  // Fall back to legacy system
  const legacyResult = await authenticateAgainstLegacy(email, password);
  if (legacyResult.success) {
    // User authenticated — migrate their password
    await migrateUser(email, password, legacyResult.profile);
    return callback(null, legacyResult.profile);
  }
  return callback(new WrongUsernameOrPasswordError(email));
}
```
### Handling Dormant Accounts
Progressive rehashing only works for users who actually log in. After 90 days of migration, you’ll typically see:
- 70-85% of active users have been re-hashed
- 15-30% of accounts are dormant (haven’t logged in)
For dormant accounts, you have two options:
- Force password reset: Send an email requiring password reset for accounts that haven’t been re-hashed within the migration window
- Keep legacy auth: Maintain the legacy authentication path for dormant users indefinitely (not recommended — it becomes a maintenance burden)
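Identifying who falls into the forced-reset bucket is a simple query at the end of the window. A sketch, assuming each user record carries a `migrated` flag set by the progressive-rehashing path (the flag name and `dormant_accounts` helper are illustrative, not a platform API):

```python
def dormant_accounts(users: list[dict]) -> list[str]:
    """Emails of accounts never captured by progressive rehashing."""
    return [u["email"] for u in users if not u.get("migrated")]

users = [
    {"email": "[email protected]", "migrated": True},   # rehashed on a login during the window
    {"email": "[email protected]"},                     # never logged in: gets the reset email
]
to_reset = dormant_accounts(users)
```

Driving the reset campaign from this list, rather than emailing everyone, avoids confusing the majority of users whose passwords already work in the new system.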
## Strategy 3: Hash Algorithm Upgrade
Even if you’re not changing platforms, upgrading your hash algorithm is a migration event.
### bcrypt → Argon2id
The recommended upgrade path in 2026. Argon2id provides memory-hardness (protects against GPU attacks) that bcrypt lacks.
```python
# Progressive bcrypt → Argon2id upgrade
import bcrypt
from argon2 import PasswordHasher
from argon2.exceptions import VerifyMismatchError

# Argon2id is PasswordHasher's default variant
ph = PasswordHasher(
    time_cost=2,
    memory_cost=19456,  # 19 MiB
    parallelism=1,
    hash_len=32,
)

def verify_and_upgrade(password: str, stored_hash: str) -> tuple[bool, str | None]:
    """Verify password and return upgraded hash if applicable."""
    if stored_hash.startswith('$2'):
        # bcrypt hash — verify with bcrypt
        if bcrypt.checkpw(password.encode(), stored_hash.encode()):
            # Rehash with Argon2id
            return True, ph.hash(password)
        return False, None
    elif stored_hash.startswith('$argon2'):
        # Already Argon2 — verify directly (raises on mismatch)
        try:
            ph.verify(stored_hash, password)
        except VerifyMismatchError:
            return False, None
        # Check if parameters need updating
        if ph.check_needs_rehash(stored_hash):
            return True, ph.hash(password)
        return True, None
    return False, None
```
### Cost Factor Updates
Even without changing algorithms, increase hash cost factors periodically:
| Year | bcrypt cost | PBKDF2 iterations | Argon2id memory |
|---|---|---|---|
| 2020 | 10 | 100,000 | 37 MiB |
| 2023 | 12 | 310,000 | 19 MiB |
| 2025 | 12-13 | 600,000 | 19 MiB |
| 2026+ | 13-14 | 600,000+ | 46 MiB |
Use progressive rehashing to upgrade cost factors transparently — same technique as algorithm migration but within the same algorithm family.
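For bcrypt, the decision logic is easy because the cost factor is embedded in the hash string itself. A minimal sketch, assuming hashes in the standard `$2b$12$...` modular-crypt format (`bcrypt_cost` and `needs_cost_upgrade` are illustrative helper names):

```python
def bcrypt_cost(stored_hash: str) -> int:
    """Extract the cost factor from a bcrypt hash like '$2b$12$...'."""
    parts = stored_hash.split("$")  # ['', '2b', '12', '<salt+digest>']
    if len(parts) < 4 or not parts[1].startswith("2"):
        raise ValueError("not a bcrypt hash")
    return int(parts[2])

def needs_cost_upgrade(stored_hash: str, target_cost: int = 13) -> bool:
    """True if the hash should be recomputed at the current cost on next login."""
    return bcrypt_cost(stored_hash) < target_cost
```

On a successful login, a hash flagged by `needs_cost_upgrade` gets rehashed at the target cost with the plaintext that was just verified, exactly as in the cross-algorithm case. (argon2-cffi's `check_needs_rehash`, used earlier, performs the equivalent check for Argon2 parameters.)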
## Platform-Specific Export Methods
### Export from Keycloak
```sql
-- Direct database query (PostgreSQL)
SELECT u.username, u.email,
       c.secret_data->>'value' AS password_hash,
       c.credential_data->>'algorithm' AS algorithm,
       c.credential_data->>'hashIterations' AS iterations
FROM user_entity u
JOIN credential c ON c.user_id = u.id
WHERE c.type = 'password';
```
### Export from Auth0
```bash
# Create export job (profile data only)
curl -X POST "https://YOUR_DOMAIN.auth0.com/api/v2/jobs/users-exports" \
  -H "Authorization: Bearer $MGMT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "format": "json",
    "fields": [
      {"name": "email"},
      {"name": "user_id"}
    ]
  }'
```

Note: the export job returns profile data only. Auth0 does not include password hashes in user exports; obtaining them requires filing a support ticket with Auth0.
### Export from LDAP/AD
```bash
# OpenLDAP — export userPassword attribute
ldapsearch -x -H ldaps://ldap.corp.local -D "cn=admin,dc=corp,dc=local" -W \
  -b "ou=people,dc=corp,dc=local" \
  "(objectClass=person)" uid mail userPassword
```
Note: Active Directory does not expose password hashes via LDAP. Use Entra Connect’s Password Hash Sync or the DSInternals PowerShell module for offline extraction from ntds.dit backups.
## Migration Timeline
| Day | Activity |
|---|---|
| 1-5 | Audit source hash formats and target compatibility |
| 6-10 | Build progressive rehashing integration or bulk import scripts |
| 11-15 | Test with 100 pilot users |
| 16-20 | Deploy to production, monitor rehash progress |
| 21-90 | Progressive rehashing captures active users |
| 91-95 | Force password reset for remaining dormant accounts |
| 96-100 | Decommission legacy authentication path |
The 90-day window isn’t arbitrary — it captures approximately 3 monthly login cycles, catching users who log in infrequently. Adjust based on your actual login frequency distribution.
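The "3 login cycles" intuition can be sketched with a toy model (an assumption for illustration, not measured data): treat each user as logging in on any given day independently with probability p, so the chance of capturing them at least once in the window is 1 - (1 - p)^days.

```python
def capture_rate(daily_login_prob: float, window_days: int) -> float:
    """Probability a user logs in at least once during the window,
    under an independent-daily-login toy model."""
    return 1 - (1 - daily_login_prob) ** window_days

# A roughly-monthly user (p ≈ 1/30) over a 90-day window is
# captured with probability ≈ 0.95; a weekly user is near-certain
monthly = capture_rate(1 / 30, 90)
weekly = capture_rate(1 / 7, 90)
```

Real login behavior is burstier than this model assumes, which is why measuring your actual login frequency distribution beats any fixed window length.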
