October 27, 2025
Ohio Department of Medicaid
Office of Contracts and Procurement
50 W. Town Street, Suite 400
Columbus, OH 43215
Subject: Response to Request for Information ODMR 2627-0002 — Community Engagement Compliance Verification
Dear Ohio Medicaid Team:
On behalf of The Center for Community Solutions, thank you for the opportunity to provide input in response to the Ohio Department of Medicaid’s Request for Information (ODMR 2627-0002) on Community Engagement Compliance Verification. We appreciate the Department’s initiative to seek stakeholder feedback as it evaluates potential system designs and processes to strengthen program integrity and improve efficiency in Medicaid eligibility determinations.
Our submission is provided in accordance with the RFI’s guidance and is intended solely for informational and planning purposes. We recognize that this response is non-binding, that it does not obligate ODM in any way, and that participation in this process conveys no preference or advantage in future procurements.
The attached comments and recommendations reflect our review of the RFI and focus on ensuring that any future verification solution:
- Promotes equitable and accessible processes for all Medicaid enrollees.
- Strengthens privacy, confidentiality, and AI governance safeguards.
- Enhances transparency through clear data-sharing and reporting requirements.
- Provides training and support to state, county, and community partners.
- Establishes robust reporting and analytics tools to monitor outcomes and support continuous improvement.
We believe these recommendations align with ODM’s goals to improve accuracy, fairness, and operational efficiency while maintaining compliance with federal and state requirements.
We appreciate the Department’s consideration of these comments and its ongoing commitment to a verification process that supports Ohioans’ access to coverage and care. Please do not hesitate to contact us if additional clarification or discussion would be helpful.
Sincerely,
Emily Campbell,
Chief Executive Officer
Section 1.1 General Overview, Page 2
Section 1.1 explains that the Ohio Department of Medicaid is seeking information from potential vendors on how to design and implement systems that would verify Medicaid enrollees’ compliance with new federal community engagement (work) requirements. It asks for ideas on data sources, technology integration with Ohio Benefits, and methods to ensure accuracy, efficiency, and program integrity. The department’s efforts would be strengthened if it sought additional information in the following areas.
Fairness and equal access considerations
Require vendors to describe how their proposed systems will ensure that all individuals, including those with disabilities, limited English proficiency, or limited digital access, can comply without undue barriers.
User experience and accessibility
Require vendors to describe plans for user testing with Medicaid enrollees, especially people with limited digital literacy, disabilities, or limited English proficiency. Require accessible design standards and multilingual interfaces.
Consumer feedback and error correction
Require vendors to describe how they will collect and respond to consumer feedback, handle error reports, and measure the time from error identification to resolution.
Burden on community partners
Ask vendors to assess the administrative burden on nonprofits, workforce agencies, and local governments that may be asked to verify community service or training participation and propose ways to minimize or offset that burden.
Manual verification workflow
Require vendors to describe how they would handle cases that cannot be verified through data, including staffing levels, training, and human-review processes to ensure fair outcomes.
Privacy, security, and data-sharing transparency
Require vendors to describe how they will protect personal information, comply with HIPAA and state privacy laws, and limit data-sharing across agencies. Vendors should also disclose data-sharing agreements and clarify that personal data will not be shared with federal immigration authorities for non-eligibility purposes.
Error-rate monitoring and public reporting
Require vendors to describe how they will measure and report error rates, wrongful terminations, appeal outcomes, and reinstatements, including data broken down by demographic group.
Good-cause and continuous-coverage safeguards
Require vendors to describe how they would build good-cause processes, presumptive reinstatement, and coverage-continuation safeguards directly into system design.
Cost and scalability
Require vendors to include cost estimates and scalability assessments—especially for verifying community service—so the state understands the fiscal implications before committing to a large procurement.
Evidence and performance benchmarks
Request that vendors provide performance benchmarks or examples from comparable state or large-scale verification systems demonstrating value, efficiency, and effectiveness relative to cost.
Integration with local and legacy systems
Require vendors to identify challenges in integrating their solutions with Ohio Benefits, the Medicaid MIS, and county-level systems, and propose realistic timelines for implementation.
Standardized guidance, funding, and training
Vendors should develop a standardized manual outlining activities that qualify toward community engagement verification and provide or coordinate funding and training for state and county caseworkers to ensure consistent implementation and minimize administrative burden.
Section 3.1(A), Response Topics, Pages 4 and 5
Section 3.1(A) asks vendors to describe how they would design and operate systems to verify Medicaid enrollees’ compliance with community engagement requirements, integrate with Ohio Benefits, and ensure accuracy and integrity.
There is little detail in this section explaining how ODM arrived at the numbers it cites for who already meets the community-engagement requirement and who would require additional verification. The document lists counts and percentages but does not explain the data sources, matching process, or assumptions used to identify the 172,000 individuals who would need verification. Without clearer information about ODM’s own data methods, it is difficult for stakeholders to assess the feasibility or fairness of the proposed approach.
ODM should ask vendors how community engagement verification will align with presumptive eligibility and application intake. Because presumptive eligibility provides short-term coverage before a full eligibility determination, federal law does not permit states to condition that temporary coverage on compliance verification, and doing so would undermine the purpose of presumptive eligibility. ODM should ask vendors to describe how presumptive eligibility (PE) cases would be coded as exempt, how the system would ensure that verification begins only after full enrollment, and how it would ensure that qualified entities are not asked to perform work-requirement checks.
Additional questions and considerations arise from this section:
• How will consumer experience and rights, such as appeal processes, error correction, and language accessibility, be addressed?
• Why is there no requirement that vendors make their methods, error rates, or audit results public?
• How will ODM handle verification for activities like community service that lack electronic records?
• How will vendors prevent coverage losses caused by data mismatches or reporting errors?
• Will vendors be required to evaluate each proposed data source for accuracy, completeness, and ease of correction?
• For any use of artificial intelligence, machine learning, or natural language processing, vendors should be required to explain how these tools would be tested for accuracy, potential bias, and transparency, and how human oversight would be maintained. In addition, vendors should address confidentiality safeguards specific to AI use, making clear that any AI system employed must not rely on open-source or publicly accessible platforms and must operate within strict security and privacy guardrails. Vendors should demonstrate that all AI processing occurs within secure, compliant environments that protect personal data and uphold both privacy and confidentiality requirements under state and federal law.
ODM should refine this question to require vendors not only to list databases and technologies but also to demonstrate how they will ensure accuracy, privacy, accessibility, and fairness, particularly for individuals who cannot be verified electronically. Vendors should also explain how their systems would connect with appeal processes and allow consumers to correct errors.
Section 3.1(B), Response Topics, Pages 5 and 6
Section 3.1(B) asks vendors to describe how their proposed systems would ensure accuracy, efficiency, and integrity in verifying community engagement compliance. It invites vendors to provide information on performance measures, error tracking, and data security. While this section moves closer to operational detail, it still leaves significant gaps in areas that affect consumers directly.
The RFI does not request information on how vendors will communicate with individuals, issue notices, or assist people who have questions or face barriers using online systems. It does not ask for a plan to coordinate with existing appeal or fair-hearing processes, even though errors or data mismatches will inevitably occur. ODM should require vendors to show how they will make the verification process understandable, accessible, and timely for all participants.
ODM should ask vendors to include detailed plans for error monitoring and correction, as described in Section 1.1, with clear metrics for reinstatements and appeals.
Finally, the RFI invites vendors to discuss data security and privacy but does not ask for specific compliance frameworks or audit standards. Vendors should be required to follow established federal and state privacy protections, specify which security standards they will use, and describe how personal information will be protected from unauthorized access or secondary use.
Vendors should include specific operational safeguards to prevent wrongful disenrollment and ensure program integrity, consistent with the coverage protections described in Section 1.1.
In summary, ODM should expand Section 3.1(B) to ensure that any proposed verification system includes clear standards for communication with consumers, fair-hearing integration, ongoing performance monitoring, and strong privacy and security protections.
Section 3.1(C) (1) User Authentication: Secure login options including multi-factor authentication, Page 6
Section 3.1(C) asks respondents to describe how their proposed solution would interact with existing systems and workflows within the Ohio Department of Medicaid. This includes how new verification functions could be integrated into Ohio Benefits, the Medicaid Information System (MIS), and related data interfaces. We think this section would benefit from additional details.
How will the vendor offer multiple secure authentication options such as SMS, email, or voice verification so participants can choose what works best for them? Vendors should ensure that password recovery and account reset processes are simple, available in multiple languages, and accessible to people with disabilities. Authentication systems should be designed to avoid requiring smartphone ownership or continuous internet access, since many Medicaid participants rely on basic phones or shared devices. How will identity verification connect with existing ODM systems, such as the Ohio Benefits portal, to avoid duplicate logins or new credentials?
Section 3.1(C) (2) Document Upload and OCR: Allow recipients to upload supporting documents with automatic data extraction, Page 6
Vendors should describe how they would design a simple upload process that accepts photos or scanned images from any device, including low-resolution phone cameras. How will the vendor handle a wide range of document types and formats, and what process will be used for flagging unreadable images for human review instead of rejecting them automatically? Describe how the upload feature will be accessible to people with limited English proficiency and disabilities, including screen-reader compatibility and alternative text prompts. How will uploaded files be encrypted while in transit and at rest, with strict retention limits and automatic deletion once verification is complete?
Section 3.1 (C) (3) Notifications and Alerts: Automated reminders for recipients and alerts for caseworkers, Page 6
How would vendors enable multiple delivery channels (text, email, IVR/robocall, postal mail) so recipients can choose what works best, especially if connectivity or device type is a barrier? The system should send early-warning reminders to recipients (e.g., upcoming deadlines, missing documents) and priority alerts to caseworkers only when recipient non-response crosses a defined risk threshold, reducing excessive workload; these warnings should be auditable. Vendors should describe how they would create caseworker dashboards that aggregate alerts and prioritize high-risk cases rather than flooding staff with individual notifications, helping reduce “alert fatigue” and overload. How would they track and report delivery and engagement metrics (sent, delivered, replied, unresolved) so the system can be optimized and alerts refined to avoid unnecessary notifications?
Section 3.1 (C) (6) Audit Trail: Full traceability of data access and changes, Page 6
Vendors should include comprehensive audit-trail functionality that records and retains all system interactions, including data access, modifications, and decision outputs. These logs should support compliance reviews, facilitate appeals, and ensure accountability across automated and human-review processes.
Section 3.1 (C) (7) Fraud Detection: Built-in mechanisms to flag suspicious or inconsistent data, Page 6
Vendors should incorporate automated and human-reviewed fraud-detection tools capable of identifying anomalies or inconsistencies in reported work or community-engagement activities. The system should allow ODM to review and resolve potential fraud indicators without interrupting coverage for compliant participants.
Section 3.1 (C) (10) Reporting and Analytics: Dashboards and performance monitoring, Page 6
Vendors should include configurable reporting and analytics capabilities that allow ODM to monitor system performance, compliance outcomes, and error trends in real time. Dashboards should display metrics such as verification completion rates, appeal resolutions, reinstatements, and demographic breakdowns of compliance outcomes. The system should support audit-ready exports and data visualizations to facilitate transparency, oversight, and continuous improvement.
Section 3.1 (D) Describe, in general, your organization’s proposed approach to pricing models for performing services related to Community Engagement Compliance Verification, Page 6
Vendors should be asked to explain what factors would drive cost variation, including data system integration, staffing, and ongoing maintenance.
Vendors should describe how pricing would account for scalability if the number of individuals requiring verification changes significantly over time.
Vendors should identify which functions they expect ODM or other contractors to perform so the state can understand what is included in vendor pricing and what is not.
Vendors should also explain whether performance-based payments could unintentionally encourage speed over accuracy and propose safeguards against that risk.