# Validation
> **DANGER**
> The Operator program is currently invite-only. If you're interested in running an inference node, please reach out to us at hello@inferencegrid.ai.
The Inference Grid operates on an optimistic model.
The operator optimistically returns the response and invoice to the consumer; if the consumer does not pay the invoice, the consumer's reputation score is penalized. Symmetrically, the consumer pays every invoice they receive, assuming the response is valid; if the operator's response is later independently verified to be invalid, the operator's reputation score is penalized.
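The sequence below is a minimal sketch of that flow, assuming a hypothetical `Settlement` record and penalty constant; the actual scoring rules and data structures are not published.

```python
from dataclasses import dataclass

@dataclass
class Settlement:
    invoice_paid: bool      # did the consumer pay the invoice?
    response_valid: bool    # did the response pass independent verification?

# PENALTY is an illustrative constant, not a documented value.
PENALTY = 10

def settle(s: Settlement, consumer_score: int, operator_score: int) -> tuple[int, int]:
    """Apply the optimistic model after the fact: the response and invoice
    have already been exchanged; only reputation is adjusted here."""
    if not s.invoice_paid:
        consumer_score -= PENALTY   # consumer skipped payment
    if not s.response_valid:
        operator_score -= PENALTY   # operator served an invalid response
    return consumer_score, operator_score
```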
## Users
Operators can configure their inference nodes to require a minimum reputation score. If a user's reputation score is below this threshold, their requests will not be routed to that operator's nodes. We recommend that operators set their minimum reputation score to 100.
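A sketch of how that gate might look on the operator side; the configuration key and routing check below are assumptions for illustration, not the real node API.

```python
# Hypothetical node configuration; key names are illustrative only.
node_config = {
    "min_reputation_score": 100,
}

def accept_request(user_reputation: int, config: dict = node_config) -> bool:
    """Reject requests from users below the node's configured threshold."""
    return user_reputation >= config["min_reputation_score"]

accept_request(80)    # False: the request is not routed to this node
accept_request(150)   # True
```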
A new user starts with a reputation score of 100, but they can increase it by performing various actions such as:
- Providing a Spark wallet with sufficient funds
- Purchasing credits for their API key
- Connecting via UMA Auth
- Providing a valid email address / phone number
All of these actions increase a user's reputation score and, in turn, the number of operators willing to process their requests, resulting in faster response times and access to higher-quality models.
If a user consistently pays invoices, their reputation score will increase. Conversely, if a user does not pay invoices, their reputation score will decrease.
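Putting the two mechanisms together, a user's score might evolve roughly as in the sketch below; every constant and action name here is hypothetical, since the real weights are not published.

```python
STARTING_SCORE = 100

# Illustrative bonuses for the verification actions listed above.
ACTION_BONUS = {
    "spark_wallet_funded": 25,
    "credits_purchased": 25,
    "uma_auth_connected": 25,
    "email_or_phone_verified": 10,
}

def reputation(actions: list[str], invoices_paid: int, invoices_unpaid: int) -> int:
    score = STARTING_SCORE
    score += sum(ACTION_BONUS.get(a, 0) for a in actions)
    score += invoices_paid          # consistent payment gradually raises the score
    score -= 5 * invoices_unpaid    # unpaid invoices are penalized more heavily
    return score
```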
## Operators
To ensure that operators are providing quality responses, the Inference Grid implements a validation system for workers and uses it when routing requests. Each of an operator's workers will be validated independently on an ongoing basis.
For each worker, the Inference Grid may submit up to 48 test requests per day. The reputation score attached to these requests is set arbitrarily high, but the invoices will not be paid. After a period of time, your worker will be notified that an earlier request was a test query so that the unpaid invoices can be reconciled.
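For bookkeeping on the worker side, one simple approach is to track outstanding invoices and drop the ones later flagged as test queries; the callback names and notification shape below are assumptions, not the actual worker API.

```python
# Outstanding invoices by request id; a real worker would persist this.
pending_invoices: dict[str, int] = {}

def on_response_sent(request_id: str, amount_sats: int) -> None:
    """Record the invoice issued alongside a response until it is paid or flagged."""
    pending_invoices[request_id] = amount_sats

def on_invoice_paid(request_id: str) -> None:
    pending_invoices.pop(request_id, None)

def on_test_query_notification(request_id: str) -> None:
    """The grid later reveals which requests were tests; remove them so the
    unpaid balance reflects only real, collectible invoices."""
    pending_invoices.pop(request_id, None)
```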
These tests are used to ensure that your worker is working as claimed. For example, if your worker claims to support tool use, the Inference Grid will test this by submitting a request that requires tool use and checking that your worker is able to return the correct response.
Furthermore, the Inference Grid will compare your worker's responses against responses from other workers and use that comparison to evaluate the quality of the model being served.
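A validation pass of this kind might look roughly like the sketch below: one check that a claimed capability actually works, and one naive agreement measure against peer responses. Both functions and the response shape are hypothetical, not the grid's actual validation logic.

```python
def passes_tool_use_test(response: dict) -> bool:
    """A worker that advertises tool use should emit at least one tool call
    for a test prompt that requires it."""
    return bool(response.get("tool_calls"))

def agreement_with_peers(candidate: str, peer_responses: list[str]) -> float:
    """Naive token-overlap (Jaccard) agreement between one worker's answer
    and the answers returned by other workers for the same request."""
    cand = set(candidate.lower().split())
    if not cand or not peer_responses:
        return 0.0
    scores = []
    for peer in peer_responses:
        p = set(peer.lower().split())
        scores.append(len(cand & p) / len(cand | p) if cand | p else 0.0)
    return sum(scores) / len(scores)
```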