Summary — Reporting, Technology, and The Future
Overview and Context
This module traces the evolution of technology and reporting in the natural gas (and broader energy) industry, from the earliest mainframe computing era through the modern cloud and artificial intelligence landscape. The instructor draws on three decades of direct industry experience to illustrate how each technological era reshaped the way companies capture, store, process, and present data. The module also addresses the future of energy — the transition from carbon-based to green energy — and closes with practical career advice for students entering the industry.
The technology discussion maps directly onto the core ETRM framework studied throughout the course: every technological advancement ultimately serves the same three-part business process of Supply → Logistics → Demand, regardless of the energy type or industry.
Era 1: Mainframe Computing (1940s–1980s)
Mainframe computing refers to the large, centralized IBM computers that dominated corporate data processing from the 1940s through the 1980s. In the natural gas industry, these systems were the first mechanism for automating invoicing, trial balances, accounts receivable ledgers, and other financial reports.
Key Characteristics
- Punch cards were the primary input mechanism — physical cards with holes punched in specific columns that instructed the computer which report to run. A clerk would physically carry punch cards to a window outside the computer room and submit them.
- Managed exclusively by IT departments — business users had no direct access to the computers or the programs; all requests were routed through IT.
- All processing and report generation had to be scheduled in advance; results often took hours to return.
- If errors were found in the output (e.g., wrong accounting numbers in a trial balance), the entire job had to be resubmitted — sometimes keeping accounting staff in the office until 2:00–3:00 AM during corporate close.
Costs and Constraints
- Extremely high capital investment — hardware alone cost tens or hundreds of thousands of dollars.
- High ongoing operating costs for maintenance, staffing, and facilities (computers often occupied an entire floor of a building).
- Rigid design and development cycles lasting months or even years; by the time a requested system was delivered, the underlying business need had sometimes already changed or disappeared.
- Changes to existing systems were treated as new development projects, requiring the same lengthy approval and design process.
- Resources were scarce and shared across departments; a department requesting a new system would often be told to wait two or three years while other projects were prioritized.
- Multiple levels of management approval were required before any project could begin.
Relationship to ETRM
Despite its limitations, mainframe computing was a genuine advancement over fully manual processes (e.g., clerks typing invoices one at a time). It established the principle that structured data capture and automated reporting are foundational to running a pipeline, trading desk, or utility — a principle that remains true in modern ETRM platforms.
Era 2: Personal Computer Computing (Late 1970s–Early 1990s)
The arrival of the personal computer (PC), exemplified by early luggable portables such as the Compaq Portable (colloquially described as resembling a sewing machine) and later desktop models from manufacturers such as Dell and AT&T, fundamentally shifted control from centralized IT departments to individual business users.
Key Characteristics
- Users could design, develop, and run their own programs without waiting for IT approval or scheduling.
- Early software tools included VisiCalc (the first spreadsheet application), WordPerfect (word processing), Lotus 1-2-3 (spreadsheet with macro programming), FoxPro (database development), and R&R Report Writer (reporting tool).
- Macros — small programs written within spreadsheet applications like Lotus 1-2-3 or later Microsoft Excel — allowed users to automate repetitive tasks and build lightweight custom systems.
- Microsoft eventually dominated the productivity software market with Word, Excel, PowerPoint, and Access, largely displacing earlier competitors.
- Storage media evolved from floppy disks → hard-shell (3.5-inch) diskettes → CDs → DVDs → cloud storage.
Benefits
- Dramatically faster development cycles — users understood their own needs and could immediately test what success looked like.
- Productivity increased significantly; users could print their own reports without submitting requests to IT.
- Individuals with programming skills (e.g., Lotus macros) could automate entire departmental workflows and become highly valuable to their organizations.
Risks and Weaknesses
- No version control — multiple copies of the same spreadsheet or program could exist simultaneously with no tracking of which was current.
- No backup or archiving — if a file was lost or the computer failed, work had to be rebuilt from scratch.
- Single-point-of-failure risk — if the one person who maintained a critical program was absent (due to illness, for example), that day's invoicing or reporting simply did not happen.
- Tremendous loss of controls — the professional governance that IT had provided disappeared, putting companies at financial and operational risk.
- The environment was "ripe for mistakes" — no audit trails, no change management, no segregation of duties.
Real-World Analogy
The personal computer era is described as going from one extreme (too restrictive — mainframe IT control) to the opposite extreme (too loose — no controls at all). Neither extreme was ideal.
Era 3: Local Area Network (LAN) Computing (Late 1980s–Mid-1990s)
The Local Area Network (LAN) restored IT governance while preserving the productivity gains of personal computing. A LAN connected individual desktops and laptops within a building or campus to a centralized hub or server managed by IT.
Key Characteristics
- IT could now perform centralized backups of all connected computers, restoring data protection and archiving.
- Users retained the ability to develop their own tools within IT-defined limits — a productive balance between the rigidity of mainframes and the chaos of standalone PCs.
- Networking allowed files and data to be shared across computers on the same network, eliminating the need for physical media transfer.
The Nike Network (Historical Anecdote)
Before LAN connectivity, transferring data between computers on different floors of the same building required a clerk to physically carry a floppy disk. Because these clerks wore athletic shoes (Nikes), the practice was dubbed the "Nike Network." The LAN made this practice obsolete by enabling electronic data transfer within the building.
Limitations
- LANs were truly local — typically confined to a single building or campus. Connecting geographically separate offices (e.g., Chicago and San Francisco) was extremely difficult.
- Remote access required dedicated telephone lines and software such as pcAnywhere, which connected over dial-up modems with their characteristic connection sound.
- Installing or updating software for clients in different cities required physical travel — a representative would fly to each location, insert a CD or DVD, and manually update the system.
- Operating as a third-party vendor serving multiple client sites across the country was "grueling" because each client's network had its own configuration, firewalls, and security rules.
Relationship to ETRM
The LAN era is significant for ETRM because it introduced the concept of a shared data environment within an organization — a precursor to the "single source of truth" that modern cloud-based ETRM platforms provide.
Era 4: Internet Computing (Mid-1990s–2010s)
Internet computing was transformative for every industry, including natural gas and energy trading. The public internet connected geographically dispersed offices, clients, and trading partners into a single, accessible network.
Key Characteristics
- Unlimited mobility — users could access company systems, execute trades, schedule gas nominations, and run reports from anywhere with an internet connection (hotels, airports, mobile devices).
- Single source of truth — rather than maintaining different software versions at different client sites, a vendor could host one system that all clients logged into simultaneously. A bug fixed once was fixed for every client.
- Software updates and patches could be deployed centrally without travel, eliminating the need to send technicians to each physical location.
- Data analytics became practical at scale because all data flowed into one central repository accessible to reporting tools from anywhere.
Business Impact — The Amazon/Sears Analogy
The instructor draws a compelling analogy: catalog retailers like Sears and JCPenney had been doing "order from home, ship to door" commerce for over 100 years. All they needed to do was digitize their catalogs for the internet. They failed to do so. Amazon (originally a bookseller) recognized the opportunity and built the dominant e-commerce platform. The lesson: always keep your eyes open for new technology and act on opportunities proactively rather than waiting to be directed.
Data as Power
A central principle introduced in this era: "He who has the data has the power." The instructor recounts developing an executive dashboard for Northern Natural Gas that translated operational and financial data into strategic insights — where the company was making money, what its cost centers were, and where margins were strongest. This presentation of data (not the underlying systems that captured it) led directly to a promotion and salary increase. The takeaway: data presentation and storytelling for executives creates more immediate career value than system development alone.
Data Security and Disaster Recovery
Internet computing introduced new responsibilities around data protection:
- Regular backups — ideally nightly, capturing the complete data set for all clients (a scripted sketch follows this list).
- Off-site storage — physical media (tapes, drives) transported to secure facilities such as Iron Mountain (a company that stores data in underground mine vaults for protection against catastrophic events).
- Geographical redundancy — maintaining a secondary data center in a different region (e.g., a primary center in Houston and a standby center near Las Vegas) so that a disaster in one location does not cause permanent data loss or extended outages.
- Multi-factor authentication and active defenses against cyber attacks.
- Regular SOC (Service Organization Control) audits to demonstrate to clients that security and backup procedures meet professional standards.
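As a deliberately simplified illustration of the nightly-backup and off-site-copy practices above, the sketch below archives a data directory, copies the archive to a second location, and prunes old copies. The paths, archive format, and 30-day retention window are assumptions for illustration, not the instructor's actual procedure.

```python
import shutil
import time
from datetime import datetime
from pathlib import Path

DATA_DIR = Path("/var/etrm/data")          # hypothetical production data set
LOCAL_BACKUPS = Path("/var/etrm/backups")  # on-site backup staging area
OFFSITE_MOUNT = Path("/mnt/offsite")       # e.g., a replicated volume in a second region
RETENTION_DAYS = 30                        # assumed retention window

def nightly_backup() -> Path:
    """Archive the complete data set with a timestamp and copy it off-site."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    LOCAL_BACKUPS.mkdir(parents=True, exist_ok=True)
    archive = shutil.make_archive(str(LOCAL_BACKUPS / f"etrm_{stamp}"), "gztar",
                                  root_dir=DATA_DIR)
    shutil.copy2(archive, OFFSITE_MOUNT / Path(archive).name)  # geographic redundancy
    return Path(archive)

def prune_old_backups() -> None:
    """Drop archives older than the retention window from both locations."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for folder in (LOCAL_BACKUPS, OFFSITE_MOUNT):
        for f in folder.glob("etrm_*.tar.gz"):
            if f.stat().st_mtime < cutoff:
                f.unlink()

if __name__ == "__main__":
    nightly_backup()
    prune_old_backups()
```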
Cyber Attack Response — Real-World Example
The instructor describes an actual ransomware/hacking incident: upon detecting infiltration, the company immediately shut down the affected environment (rather than negotiating or attempting to remediate in place), rolled back to a backup from two days prior (which had been scrubbed for malware), and spun up a new clean environment within 12 hours. Normal operations resumed the next morning. The lesson: a tested disaster recovery plan is worth more than any reactive response.
Era 5: Cloud Computing (2010s–Present)
Cloud computing represents the current frontier — effectively a return to centralized, powerful computing, but now delivered as a leased service rather than owned infrastructure.
What Is the Cloud?
The cloud consists of massive supercomputers owned and operated by large technology companies (primarily Amazon Web Services (AWS), Microsoft Azure, and Oracle Cloud) that lease computing resources — processing power, storage, networking — to businesses on a pay-as-you-go or subscription basis.
Apartment Complex Analogy
Just as most individuals cannot afford to build an apartment complex but can afford to rent an apartment in one, most businesses cannot afford to purchase supercomputer-grade hardware but can afford to lease a portion of someone else's. The cloud provider builds and maintains the infrastructure; the client simply pays for the space and power they use.
Key Benefits
- Scalability overnight — doubling storage capacity or processing power that previously required purchasing, shipping, installing, and testing new hardware (a 3–6 month process) can now be accomplished with a phone call or portal request, effective the next morning.
- Leasing vs. buying — eliminates large capital expenditures on hardware; costs shift to predictable operating expenses.
- Incredible processing power made affordable to small and mid-sized businesses that could never have purchased equivalent hardware.
- Software as a Service (SaaS) — software is no longer purchased in a box at a retail store and installed from a CD/DVD. Instead, users pay an annual subscription and download or stream the application directly. Examples include Microsoft 365 (Word, Excel, PowerPoint) and Adobe Creative Cloud.
- Consumer example: When an iPhone prompts "Do you want to save to the cloud?", it is offering to store photos and files on Apple's remote supercomputers rather than consuming local device storage.
Implications for Third-Party ETRM Vendors
For companies like the instructor's firm (Rise Services), the cloud means:
- No need to purchase and ship physical servers to a hosting provider.
- No 3–6 month lead time to expand capacity.
- Hosting providers (e.g., Meriplex, a Houston-based provider with facilities also in Colorado and on the West Coast) manage the physical infrastructure and provide built-in redundancy.
- SOC audit reports from the hosting provider can be shared with clients as evidence of security compliance.
Reporting: Canned vs. Customizable (Ad Hoc)
Effective reporting is identified as one of the most undervalued components of any ETRM or business software selection process. The instructor advises: when evaluating any software system, always ask specifically about reporting capability — how easy it is to use, whether it requires programming knowledge, and whether it supports user-configurable output.
Canned Reports
- Canned reports are pre-configured, fixed-format reports built into the software.
- The data fields, layout, subtotaling, and filtering are determined at design time and cannot be changed by the end user.
- Best suited for standardized, recurring reporting (e.g., regulatory filings, standard financial statements).
- Weakness: every time a user wants a variation — a different subtotal level, an additional column, a different filter — a developer must build a new report.
Customizable (Ad Hoc) Reports
- Ad hoc reports (also called customizable reports) allow end users to select any meaningful field from the database, arrange columns, apply filters, set sort orders, and define subtotaling — all without developer involvement.
- The data universe presented to users should include meaningful business fields (customer name, pipeline, price, volume, point) rather than internal system codes that only developers understand.
- Once implemented, ad hoc reporting essentially eliminates the need for the vendor to write custom reports, freeing development resources to focus on data quality and new functionality.
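To make the contrast with canned reports concrete, here is a minimal sketch of how an ad hoc request might be assembled at run time from user-selected fields, filters, subtotal levels, and sort order, with no developer involved. The table and column names (deals, counterparty, pipeline, point, volume_mmbtu, price) are hypothetical and not taken from any actual ETRM product.

```python
import sqlite3

def ad_hoc_report(conn, fields, filters=None, group_by=None, order_by=None):
    """Build and run a query from user-selected fields, filters, and subtotals."""
    select = list(group_by or fields)
    if group_by:  # subtotaling: sum the numeric measures at the chosen level
        select += [f"SUM({f}) AS total_{f}" for f in fields if f not in group_by]
    sql = f"SELECT {', '.join(select)} FROM deals"
    params = []
    if filters:  # filter values are passed as parameters, not concatenated into SQL
        sql += " WHERE " + " AND ".join(f"{col} = ?" for col in filters)
        params = list(filters.values())
    if group_by:
        sql += " GROUP BY " + ", ".join(group_by)
    if order_by:
        sql += " ORDER BY " + ", ".join(order_by)
    return conn.execute(sql, params).fetchall()

# Tiny in-memory example so the sketch runs end to end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deals (counterparty TEXT, pipeline TEXT, point TEXT,"
             " volume_mmbtu REAL, price REAL)")
conn.executemany("INSERT INTO deals VALUES (?, ?, ?, ?, ?)", [
    ("Acme Gas", "NNG", "Demarc", 10000, 2.85),
    ("Acme Gas", "NGPL", "Midcontinent", 5000, 2.60),
    ("Borealis", "NNG", "Demarc", 7500, 2.95),
])

# "Total volume by counterparty and pipeline, largest first" -- built on the fly.
rows = ad_hoc_report(conn,
                     fields=["volume_mmbtu"],
                     group_by=["counterparty", "pipeline"],
                     order_by=["total_volume_mmbtu DESC"])
print(rows)
```

The point of the sketch is that the final request (total volume by counterparty and pipeline, largest first) is specified entirely by the user at run time; a new variation is a new set of arguments, not a new development project.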
Business Impact — Due Diligence Story
The instructor recounts a situation where a Chicago client was being acquired by a larger Atlanta company. During due diligence, the Atlanta team asked for detailed historical data — e.g., three years of summer gas purchases from ten specific vendors, and historical transport costs. The Chicago team was able to produce each report in two to three minutes using the ad hoc reporting tool. The Atlanta team, accustomed to such requests taking weeks, was so impressed that rather than replacing the incumbent ETRM vendor (Rise Services), the acquiring company became a new, larger client.
Embedded Third-Party Reporting Tools
- Rather than building a reporting engine from scratch, the instructor recommends leveraging embedded third-party reporting software wherever possible.
- These tools are purpose-built for reporting, are well-tested, and allow the ETRM vendor to focus resources on data quality, business logic, and domain functionality.
- "Don't reinvent the wheel."
Data Storage Best Practices
- Store data at the lowest level of detail available — if daily data exists, store it daily rather than summarizing to monthly. Summary data can always be computed from detail; detail cannot be reconstructed from summaries (illustrated in the sketch after this list).
- Use a Relational Database Management System (RDBMS) — a database architecture in which related data tables are linked by keys, eliminating redundant data storage and enabling flexible querying across the entire data set.
- Prioritize recovery, recovery, recovery — data backup and disaster recovery planning are not optional; they are core business requirements.
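A small illustration of the first two principles, using an invented schema: reference data is stored once and linked by a key, daily detail is stored exactly as received, and the monthly summary is computed from that detail whenever it is needed rather than stored in its place.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Reference data lives once, keyed, instead of being repeated on every row.
    CREATE TABLE pipeline (pipeline_id INTEGER PRIMARY KEY, name TEXT);
    -- Detail is stored at the lowest level available: one row per point per day.
    CREATE TABLE daily_flow (
        flow_date    TEXT,
        pipeline_id  INTEGER REFERENCES pipeline(pipeline_id),
        point        TEXT,
        volume_mmbtu REAL
    );
""")
conn.execute("INSERT INTO pipeline VALUES (1, 'Northern Natural Gas')")
conn.executemany("INSERT INTO daily_flow VALUES (?, ?, ?, ?)", [
    ("2024-01-01", 1, "Demarc", 10000),
    ("2024-01-02", 1, "Demarc", 12500),
    ("2024-02-01", 1, "Demarc", 9000),
])

# The monthly summary is derived from daily detail on demand;
# the detail itself is never collapsed away.
monthly = conn.execute("""
    SELECT strftime('%Y-%m', flow_date) AS month, p.name,
           SUM(volume_mmbtu) AS volume_mmbtu
    FROM daily_flow d JOIN pipeline p USING (pipeline_id)
    GROUP BY month, p.name
""").fetchall()
print(monthly)
```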
Analytics: The New Frontier
Data analytics — the practice of analyzing stored operational and financial data to identify patterns, trends, and business insights — has emerged as a major value-add layer on top of traditional transaction capture and reporting.
- Customers increasingly want their ETRM vendor not just to capture and store data but to help them tell stories with that data: Where did we buy the most gas? What were our most expensive supply sources? Where did we achieve the best margins? What did transport cost us? (A toy example follows this list.)
- A growing ecosystem of analytics software platforms can be connected to ETRM databases via APIs (Application Programming Interfaces), allowing third-party visualization and business intelligence tools to consume the data.
- Analytics is a natural precursor to Artificial Intelligence, which requires large, clean, well-structured data sets to function effectively.
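As a toy example of the storytelling questions in the first bullet above, the sketch below takes a handful of invented purchase and sale rows, shaped the way they might come back from an ETRM reporting API, and answers two of those questions directly. All names and figures are made up.

```python
from collections import defaultdict

# Rows as they might come back from an ETRM query: (point, side, volume_mmbtu, price)
transactions = [
    ("Demarc",       "buy",  10000, 2.85),
    ("Demarc",       "sell", 10000, 3.10),
    ("Midcontinent", "buy",   5000, 2.60),
    ("Midcontinent", "sell",  5000, 2.70),
]

bought = defaultdict(float)   # total purchase volume by point
margin = defaultdict(float)   # sales dollars minus purchase dollars by point

for point, side, volume, price in transactions:
    if side == "buy":
        bought[point] += volume
        margin[point] -= volume * price
    else:
        margin[point] += volume * price

print("Most gas purchased at:", max(bought, key=bought.get))
print("Best margin at:", max(margin, key=margin.get))
```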
Artificial Intelligence (AI) and the Future
Artificial Intelligence (AI) is identified as the most significant near-term disruptive force in the energy industry and beyond.
Current Applications in Natural Gas
- Pipeline inspection drones equipped with sensors that can detect leaks along thousands of miles of pipeline — a task that previously required personnel on horseback or in vehicles traversing remote terrain.
- Logistics optimization — AI can analyze flow data, market prices, storage levels, and transportation constraints to recommend more cost-effective gas movement strategies.
- Accounting accuracy — AI-assisted anomaly detection can flag unusual entries or potential errors in financial records (see the sketch after this list).
- Audit support — AI tools can review large transaction data sets for compliance with contractual terms, regulatory requirements, and internal policies.
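A minimal sketch of the anomaly-detection idea referenced in the accounting bullet above: a simple median-based rule of thumb stands in for whatever model a production AI tool would actually use, and the account, amounts, and threshold are illustrative assumptions.

```python
from statistics import median

# (account, amount) pairs as they might come out of the general ledger
entries = [
    ("transport_expense", 41_200), ("transport_expense", 39_800),
    ("transport_expense", 40_500), ("transport_expense", 40_900),
    ("transport_expense", 412_000),   # likely a keying error (extra zero)
]

amounts = [amt for _, amt in entries]
typical = median(amounts)

# Flag anything far outside the typical amount for this account.
flagged = [(acct, amt) for acct, amt in entries
           if amt > 3 * typical or amt < typical / 3]
print("Review these entries:", flagged)
```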
Pace of Innovation
The module traces the acceleration of the innovation cycle:
- Pre-industrial centuries: major innovations (e.g., the printing press) occurred only every 50–100 years.
- Industrial era: major breakthroughs every 10–20 years.
- Late 20th century: every 1–2 years.
- Today (2020s): breakthrough innovations occur every few days — new AI models are released and superseded within the same week.
Will AI Replace Human Workers?
The instructor's view is that AI will not replace human workers in the near term (at least not within the next few decades, if ever). However, professionals who understand AI and can apply it within their industry will have a significant competitive advantage over those who do not. Recommended action: use tools like ChatGPT or Perplexity to research how AI is being applied in your specific field and identify opportunities to get involved.
The Energy Future: Carbon to Green
The module closes with a discussion of the long-term energy transition.
Current State
- The global energy supply remains overwhelmingly carbon-based (natural gas, oil, coal). Green energy sources (solar, wind, hydrogen, etc.) exist but are not yet cost-competitive at the scale needed to replace fossil fuels entirely.
- The instructor frames the transition as primarily a matter of pricing and technology — not ideology. If green energy were economically competitive with natural gas on a per-BTU basis, the market would move toward it rapidly. Currently, achieving equivalent energy delivery from green sources remains cost-prohibitive for most consumers.
- A benchmark framing: if consumers were willing to pay the equivalent of $15 per gallon of gasoline for green energy, the transition could accelerate dramatically. Most are not.
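A rough back-of-the-envelope puts that benchmark in per-BTU terms. The gasoline energy content (about 0.12 MMBtu per gallon) and the natural gas price (about $3 per MMBtu) used below are approximate, illustrative assumptions, not figures from the lecture.

```python
GASOLINE_MMBTU_PER_GALLON = 0.12   # roughly 120,000 Btu per gallon (approximate)
NATURAL_GAS_PRICE = 3.00           # assumed benchmark, $/MMBtu

willingness_per_gallon = 15.00     # the "$15 per gallon" framing
implied_price_per_mmbtu = willingness_per_gallon / GASOLINE_MMBTU_PER_GALLON

print(f"${implied_price_per_mmbtu:.0f}/MMBtu implied "
      f"({implied_price_per_mmbtu / NATURAL_GAS_PRICE:.0f}x natural gas)")
# -> $125/MMBtu implied (42x natural gas)
```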
Long-Term View
- Technology — including AI — will eventually drive down the cost of green energy production, storage, and distribution.
- The underlying business process (Supply → Logistics → Demand) remains unchanged regardless of energy type. ETRM systems that track units of energy can be adapted to track green energy units just as readily as natural gas units.
- Students entering the energy industry should not view the "natural gas focus" of this curriculum as limiting — the process knowledge transfers to any energy type, and indeed to any commodity industry.
Universality of the Framework
The instructor makes an explicit and important point: the supply-logistics-demand framework studied throughout this course applies to every industry — oranges, iPhones, furniture, pharmaceuticals, clothing. Students have been learning process thinking, not just natural gas. This broadens the career applicability of everything covered in the course.
Career Advice and Practical Takeaways
Throughout the lecture, the instructor weaves in career guidance relevant to students entering the energy industry:
- Proactively learn new technology — do not wait for your employer to fund or direct your learning. The instructor purchased his own computer, taught himself to code macros, and built systems for every department in his company, which launched his career trajectory.
- Leverage your coursework in interviews — let prospective employers know you understand MMBTUs, pipeline operations, storage, scheduling, and the overall supply-logistics-demand process. New hires typically take 6–12 months to become productive; students with this background can contribute sooner.
- Research companies before interviews — know their business model, office locations, current projects, and industry role (pipeline, storage, utility, marketer, etc.).
- Ask about reporting in software evaluations — it is consistently undervalued and later becomes a major pain point.
- Data presentation to executives creates more immediate career impact than system development alone.
- Stay curious about AI — regardless of your specific role or industry, understanding how AI is being applied in your field will be a differentiating competency.