On Friday, November 22nd, 2024, the annual e-Records Conference returned for the 25th time! As always, the event was co-sponsored by TSLAC and the Texas Department of Information Resources. Breaking last year’s record, this year’s conference was the largest ever with a total of 330 attendees. The number of Data Management Officers (DMOs) in attendance increased from 27 (in 2023) to 29. Continuing last year’s successful addition of a third breakout session per time slot, attendees heard speakers from state and local governments as well as vendor partners. In between sessions, attendees visited the 33 vendor tables to learn more about products and services. This year’s theme, “The Next Records Frontier,” highlighted the future of records management in a rapidly changing landscape.
The Texas Record will provide short recaps of the sessions in a series of blog posts over the next few weeks, so stay tuned! Presentation descriptions and details are available on the conference website.
AI at a Crossroads: Hope for Humanity or Existential Threat?
by Megan Carey
To contextualize the current sentiment toward AI (“Is it friend or foe?”), Natalie Smolenski of Hyland took e-Records attendees back in history to examine how society shifts and responds to new technological developments, particularly those as impactful as artificial intelligence (AI). Using the advent of the printing press and its subsequent widespread use, Smolenski traced a pattern of revolution and backlash.
The printing press was instrumental in ushering in the Renaissance, a period of rapid economic and population growth as well as fundamental scientific, technological, and cultural innovation. It also prompted calls for censorship and control over the new technology and the perceived threats (real or otherwise) it posed. The ability to print text more efficiently meant that access to information was at an all-time high. Those for whom information had been out of reach due to cost and social status now had unprecedented access. Concerns about what information was being circulated, and about the accuracy or truth of that content, led to calls to centralize printing so that it could be reviewed, approved, and otherwise regulated through licensing. Fast forward to the 2020s: the age of artificial intelligence is upon us, and similar reactions and solutions are bandied about regarding the new technology (which isn’t as new as you’d think; if you’ve used OCR, you’ve used AI). There have been calls for government control of AI through oversight bodies, laws, and regulations to keep its development and growth from outpacing our understanding of its impact and influence.
As AI development continues beyond generative AI (ChatGPT and similar tools that create new content after being trained on existing data), there will invariably be “AI Doomers” and “Techno-Optimists” with conflicting viewpoints on AI’s impact on humanity. Smolenski wrapped up the keynote by bringing the focus back to the human element. She posits that attempting to avoid potential catastrophe through technological regulation alone will not be sufficient; instead, she believes it will be possible through the “lost art” of diplomacy, itself a creative and innovative process.
Archiving Doesn’t Cut It: Future of Records Management
by Raul Gonzalez
In “Archiving Doesn’t Cut It,” the speaker took an in-depth look at file shares and the advantages and disadvantages of three common in-place management architectures. As analysts, one issue we’ve come to know well is how easily data can proliferate. Combine that proliferation with ROT (redundant, obsolete, and transitory records), storage costs, and sensitive information, and you’ll soon see the need for a strong file-share foundation. The speaker covered three approaches: “data mapping,” “piggy-back content,” and “dedicated content.” Each can help users access their data, but each has drawbacks. For example, the “dedicated content” strategy pulls from the data source and makes it accessible to users onsite. This helps with efficiency and quick access to data; however, it reflects the data only as it existed at a certain point in time. In other words, changes to the original data source will not automatically transfer to the dedicated content.
Revolutionize Your Data Integrity: Transformative Strategies for State and Local Agencies
by Sahar Arafat-Ray
The presentation focused on strategies for implementing automated tools and using them to protect an organization’s data and records and to preserve data integrity. Monitoring the data landscape is important for both compliance and integrity.
One concept introduced was an “E-Records Report Card” that grades data on measures such as accuracy, completeness, conformity, integrity, timeliness, and uniqueness. Another was the migration of data from an older legacy system to a newer one (e.g., cloud software), paired with a governance strategy so the data survives the migration. When data is migrated to an automated system, there need to be checks and balances to ensure the legacy data is maintained from beginning to end of the process. An organization must be sure there are no mistakes, because incorrect, missing, and mismatched data can have devastating consequences for people and organizations. Once bad data has been published, it is very difficult to take down and correct. Because systems are complex, multi-layered, and can break down at any stage, making sure the data is accurate, complete, easy to access for any request, and regularly updated is imperative to ensuring data integrity.
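The “report card” idea lends itself to a small illustration. The sketch below is a hypothetical example (not a tool presented at the conference) that scores a batch of records on two of the named measures, completeness and uniqueness; the field names, sample records, and scoring scheme are all assumptions made for demonstration.

```python
# Hypothetical "E-Records Report Card" sketch: score a batch of records on
# two of the measures named in the session. Field names and scoring rules
# are illustrative assumptions, not an actual agency standard.

def report_card(records, required_fields, key_field):
    total = len(records)
    # Completeness: share of records with every required field filled in.
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    # Uniqueness: share of records whose key value appears exactly once.
    keys = [r.get(key_field) for r in records]
    unique = sum(1 for k in keys if keys.count(k) == 1)
    return {
        "completeness": round(complete / total, 2),
        "uniqueness": round(unique / total, 2),
    }

# Sample (invented) records: one has a blank title, two share an ID.
records = [
    {"id": "A1", "title": "Retention schedule", "date": "2024-01-05"},
    {"id": "A2", "title": "", "date": "2024-02-10"},
    {"id": "A1", "title": "Duplicate entry", "date": "2024-03-01"},
    {"id": "A3", "title": "Disposition log", "date": "2024-04-22"},
]

print(report_card(records, required_fields=("id", "title", "date"),
                  key_field="id"))
# → {'completeness': 0.75, 'uniqueness': 0.5}
```

A real report card would add checks for the other measures (conformity to formats, timeliness of updates, and so on), but even a minimal scorecard like this makes migration problems visible before bad data is published.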
3… 2…1… Launch an External Research Request Process!
by Craig Kelso
Pamala Baker and Caroline Corpus-Ybarra from the Department of Family and Protective Services (DFPS) discussed their team’s ongoing efforts to provide data for large research requests from higher education institutions and public policy researchers. Part of the solution is sharing as much data as possible on the Open Data Portal, which helps to cut down on the 300-plus hours spent on average per request.
The session walked attendees through the workflow improvements made by their team, which focused on mindful collaboration, thorough communication with researchers and other units of the agency, and documented processes so the team is prepared when requests for information arrive. They also discussed the tools they have deployed, such as a request portal, a project management platform for internal tracking, dedicated research mailboxes, and encryption software for data transfers. Their overall success was measured in terms of flexibility, structured and consistent workflows, centralized information in the portal for easier tracking, and safeguards to protect sensitive data.