Admin Workflow
Every KYC (Know Your Customer) application that flows through the system must pass a dual review before any data is submitted to regulators, exchanges, or depositories. This dual review is called the “maker-checker” process — a concept borrowed from banking operations where no single person can both initiate and approve a transaction. In our context, the “maker” is often the system itself (automatically verifying data against thresholds), and the “checker” is a human supervisor who gives the final sign-off. Think of it as the quality control station on the assembly line: the maker inspects each item against a checklist, and the checker does a final spot-check before the item ships. This page explains when applications sail through automatically, when they get flagged for human review, and what the checker is ultimately responsible for.
The workflow has three tiers: automated approval (the majority of cases), manual review by operations staff (the maker), and final sign-off by a supervisor (the checker). Let us walk through each.
Maker-Checker Flow
| Step | Role | Action | Outcome |
|---|---|---|---|
| 10 | Maker (System) | Auto-verify all checks against thresholds. If ALL pass → auto-approve. | Auto-approved (majority of cases) |
| 10 | Maker (Ops) | If any check is marginal (e.g., name partially matches), manually review flagged fields. | Approved / Escalated / Rejected |
| 11 | Checker (Supervisor) | Review maker’s decision. Final approval or rejection. Mandatory — no batch processing without it. | Checker Approved → batch pipelines fire |
| Esc | Compliance | AML (Anti-Money Laundering) HIGH risk, PEP (Politically Exposed Person) matches, sanctions hits escalated from maker. Enhanced CDD (Customer Due Diligence) required. | Approved with conditions / Rejected |
In plain English: Step 10 is where the system (or an operations team member) reviews the application. Step 11 is where a supervisor gives the final green light. Only after Step 11 does the batch pipeline start submitting data to KRAs, exchanges, and depositories.
The key to keeping the process efficient is the auto-approve criteria. The better your upstream data capture and verification, the higher the percentage of applications that pass through without human involvement.
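The two-step flow above can be sketched as a small decision function. This is an illustrative sketch, not the production implementation — the `Decision` values and function signature are assumptions made for the example.

```python
from enum import Enum

class Decision(Enum):
    # Possible maker outcomes (names are illustrative)
    AUTO_APPROVED = "auto_approved"
    APPROVED = "approved"
    ESCALATED = "escalated"
    REJECTED = "rejected"

def maker_checker_flow(all_checks_pass: bool,
                       maker_decision: Decision,
                       checker_approves: bool) -> str:
    """Step 10: maker decides; Step 11: checker gives the final sign-off."""
    # Step 10 — maker: auto-approve when every check passes its threshold,
    # otherwise fall back to the ops reviewer's decision
    decision = Decision.AUTO_APPROVED if all_checks_pass else maker_decision
    if decision == Decision.REJECTED:
        return "rejected"
    if decision == Decision.ESCALATED:
        return "escalated_to_compliance"
    # Step 11 — checker: mandatory gate; nothing is submitted without it
    return "batch_pipelines_fire" if checker_approves else "sent_back_to_maker"
```

Note that the checker gate runs even for auto-approved applications — there is no path to the batch pipelines that skips Step 11.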
Auto-Approve Criteria (Zero Human Touch)
Application is auto-approved if ALL conditions are met:
| Check | Required Outcome |
|---|---|
| PAN (Permanent Account Number) status | Status = E (valid) |
| PAN-Aadhaar linked | Linked (= Y) |
| PAN name vs DigiLocker name | Name matches government records with high confidence |
| Penny drop name match | Bank account name verification passes |
| Face match score | Biometric face match passes |
| Liveness detection | Liveness check passes |
| AML risk | LOW risk |
| PEP match | No match |
| Sanctions match | No match |
In plain English: if the customer’s PAN is valid and linked to Aadhaar, their name matches across documents, their selfie matches their photo ID, they pass liveness detection, and they have no AML/PEP/sanctions flags, the system auto-approves without any human touching it.
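The auto-approve rule is a single conjunction over all nine checks. A minimal sketch, assuming hypothetical field names (the real verification payload will differ):

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    # Field names are illustrative, not the production schema
    pan_status: str          # "E" = valid
    pan_aadhaar_linked: str  # "Y" = linked
    name_match_high_conf: bool   # PAN name vs DigiLocker name
    penny_drop_pass: bool        # bank account name verification
    face_match_pass: bool        # biometric face match
    liveness_pass: bool
    aml_risk: str            # "LOW" / "MEDIUM" / "HIGH"
    pep_match: bool
    sanctions_match: bool

def is_auto_approvable(r: VerificationResult) -> bool:
    """Zero human touch only if ALL conditions hold."""
    return (
        r.pan_status == "E"
        and r.pan_aadhaar_linked == "Y"
        and r.name_match_high_conf
        and r.penny_drop_pass
        and r.face_match_pass
        and r.liveness_pass
        and r.aml_risk == "LOW"
        and not r.pep_match
        and not r.sanctions_match
    )
```

Because the rule is a strict AND, a single marginal check is enough to route the application to manual review.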
When an application does not meet all the auto-approve thresholds, it gets flagged for manual review. The triggers below define exactly which checks can be marginal (and therefore resolvable by a human reviewer) versus which ones must be escalated to compliance.
Manual Review Triggers
| Trigger | Condition | Action Required |
|---|---|---|
| Name mismatch | PAN vs DigiLocker name partially matches but does not pass high-confidence threshold | Verify names visually, check for initials/abbreviations |
| Penny drop marginal | Bank account name does not clearly match identity documents | Check bank name vs Aadhaar name |
| Face match marginal | Biometric face match requires manual review | Review selfie quality, request retake if needed |
| AML MEDIUM risk | Risk score above LOW threshold | Review hit details, confirm false positive or escalate |
| PEP declared | User self-declared PEP = Yes | Enhanced CDD: verify position, source of wealth |
| Income mismatch | Declared vs verified income diverge significantly | Review income proof documents |
In plain English: most manual reviews are caused by name discrepancies between documents. A partial name match between PAN and DigiLocker does not necessarily mean fraud — it often just means “Rajesh K” on one document and “Rajesh Kumar” on another.
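The trigger table above maps each marginal condition to a required reviewer action. A sketch of that mapping, assuming hypothetical key names for the check results:

```python
def review_triggers(checks: dict) -> list[str]:
    """Return the review actions required for an application's marginal checks.

    Keys are illustrative, not the production field names. An empty list
    means no manual-review trigger fired.
    """
    actions = []
    if checks.get("name_match") == "partial":
        actions.append("verify names visually; check initials/abbreviations")
    if checks.get("penny_drop") == "marginal":
        actions.append("check bank name vs Aadhaar name")
    if checks.get("face_match") == "marginal":
        actions.append("review selfie quality; request retake if needed")
    if checks.get("aml_risk") == "MEDIUM":
        actions.append("review AML hit details; confirm false positive or escalate")
    if checks.get("pep_declared"):
        actions.append("enhanced CDD: verify position and source of wealth")
    if checks.get("income_mismatch"):
        actions.append("review income proof documents")
    return actions
```

Several triggers can fire on the same application, in which case the maker works through every flagged action before deciding.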
Once the maker (whether automated or human) has made a decision, the checker performs the final review.
Checker Final Approval
| Checker Action | When | Result |
|---|---|---|
| Approve | Maker auto-approved or manually approved | Batch pipelines fire (KRA, CKYC, UCC, BO) |
| Reject | Fraud indicators, compliance red flags | Application rejected, client notified |
| Send Back | Missing information, unclear documentation | Returns to maker for re-review |
In plain English: the checker is the last human gate. Once they approve, the system immediately begins submitting the customer’s data to all the agencies described in the Batch Pipeline page. If they reject, the customer is notified and the application is closed. If they send it back, it returns to the maker queue for additional review.
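The three checker actions and their outcomes can be summed up in one small dispatch. This is a sketch only — the action and outcome names are assumptions for illustration:

```python
def checker_decision(action: str) -> str:
    """Map the checker's action to the resulting system behavior."""
    outcomes = {
        "approve": "fire_batch_pipelines",    # KRA, CKYC, UCC, BO submissions
        "reject": "notify_client_and_close",  # application closed
        "send_back": "return_to_maker_queue", # maker re-reviews
    }
    if action not in outcomes:
        raise ValueError(f"unknown checker action: {action!r}")
    return outcomes[action]
```

Only "approve" is a terminal success state; "send_back" loops the application to Step 10, so an application can cycle between maker and checker until it is either approved or rejected.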