Sunday, 29 March 2026

Improved SQL Plan Management APIs in Oracle 26ai

Database performance is one of the most important aspects of any application. Even a small change in a SQL execution plan can significantly impact performance. To address this, Oracle AI Database 26ai introduces improvements to SQL Plan Management (SPM) APIs, making it easier for DBAs and developers to manage and stabilize SQL execution plans.

What is SQL Plan Management?

SQL Plan Management (SPM) is a feature in Oracle that helps maintain stable SQL execution performance by controlling which execution plans are used by the database.

Instead of allowing the optimizer to automatically change plans, SPM ensures that only verified and accepted plans are used.

Key benefits include:

  • Prevents unexpected performance issues
  • Maintains consistent query execution
  • Improves application stability
  • Allows controlled plan evolution

What’s New in Oracle 26ai?

Oracle 26ai improves SQL Plan Management by enhancing the APIs available in the DBMS_SPM package. These enhancements help simplify plan capture, evolution, and performance management.

The improvements focus on:

  • Better automation of SQL plan baselines
  • Improved plan evolution capabilities
  • Easier management of execution plans
  • Enhanced performance monitoring

Understanding SQL Plan Baselines

A SQL Plan Baseline is a set of accepted execution plans for a SQL statement. The optimizer prefers plans that have been accepted into the baseline; newly discovered plans are recorded as unaccepted until they are verified.

If a new plan appears, Oracle can test it before allowing it to replace the existing plan.
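Existing baselines and their status can be inspected through the DBA_SQL_PLAN_BASELINES view, for example:

```sql
-- List stored baselines and their status flags.
-- ACCEPTED = 'YES' means the optimizer may use the plan;
-- ENABLED controls whether the baseline is considered at all.
SELECT sql_handle,
       plan_name,
       enabled,
       accepted,
       fixed
FROM   dba_sql_plan_baselines
ORDER  BY sql_handle, plan_name;
```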

Capturing SQL Plans Automatically

Oracle 26ai improves how SQL plans are captured and stored. You can capture SQL plans directly from the cursor cache.


DECLARE
  l_plans_loaded NUMBER;
BEGIN
  -- Load plans for a specific statement from the cursor cache.
  -- Replace 'your_sql_id' with the SQL_ID of the statement (see V$SQL).
  l_plans_loaded := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(
    sql_id => 'your_sql_id'
  );
  DBMS_OUTPUT.PUT_LINE('Plans loaded: ' || l_plans_loaded);
END;
/

This allows DBAs to quickly create baselines for important queries.
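Automatic capture of baselines for repeated statements can also be switched on with two initialization parameters (a sketch; choose ALTER SESSION or ALTER SYSTEM scope to suit your environment):

```sql
-- Capture a baseline automatically for any statement executed more than once.
ALTER SYSTEM SET optimizer_capture_sql_plan_baselines = TRUE;

-- Ensure captured baselines are actually used by the optimizer
-- (TRUE is the default).
ALTER SYSTEM SET optimizer_use_sql_plan_baselines = TRUE;
```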

Evolving SQL Plans

Sometimes a new execution plan may perform better than the current one. Oracle allows safe evaluation of new plans before accepting them.


DECLARE
  l_report CLOB;
BEGIN
  -- Verify any unaccepted plans for this baseline and accept the winners.
  -- Replace 'your_sql_handle' with a handle from DBA_SQL_PLAN_BASELINES.
  l_report := DBMS_SPM.EVOLVE_SQL_PLAN_BASELINE(
    sql_handle => 'your_sql_handle'
  );
  DBMS_OUTPUT.PUT_LINE(DBMS_LOB.SUBSTR(l_report, 4000));
END;
/

This ensures that only improved plans become part of the baseline.
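For more control, the task-based evolve API splits evolution into create, execute, and report steps. A sketch, assuming a valid SQL handle in place of the placeholder:

```sql
DECLARE
  l_task   VARCHAR2(128);
  l_exec   VARCHAR2(128);
  l_report CLOB;
BEGIN
  -- Create an evolve task for one baseline.
  l_task := DBMS_SPM.CREATE_EVOLVE_TASK(sql_handle => 'your_sql_handle');

  -- Run the verification: new plans are test-executed and compared.
  l_exec := DBMS_SPM.EXECUTE_EVOLVE_TASK(task_name => l_task);

  -- Produce a text report of the findings.
  l_report := DBMS_SPM.REPORT_EVOLVE_TASK(task_name => l_task);
  DBMS_OUTPUT.PUT_LINE(DBMS_LOB.SUBSTR(l_report, 4000));
END;
/
```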

Why These Improvements Matter

Modern applications generate complex workloads, especially in cloud and AI-driven environments. With Oracle 26ai focusing on AI workloads and high-performance analytics, stable SQL performance is more important than ever.

Improved SQL Plan Management APIs help organizations:

  • Maintain predictable performance
  • Reduce troubleshooting time
  • Optimize critical queries
  • Improve database reliability

Best Practices for Using SQL Plan Management

  • Capture baselines for critical queries
  • Regularly monitor SQL performance
  • Allow controlled plan evolution
  • Test new execution plans before enabling them
  • Use automation provided in Oracle 26ai
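Much of the above can be automated. For example, the automatic SPM evolve advisor task and plan retention can be configured through DBMS_SPM.CONFIGURE (a sketch; parameter availability depends on your release and licensing):

```sql
BEGIN
  -- Let the advisor verify and accept better plans automatically.
  DBMS_SPM.CONFIGURE('AUTO_SPM_EVOLVE_TASK', 'ON');

  -- Keep unused plans for 5 weeks before automatic purge.
  DBMS_SPM.CONFIGURE('PLAN_RETENTION_WEEKS', 5);
END;
/
```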

Final Thoughts

The improvements to SQL Plan Management APIs in Oracle 26ai make it easier for DBAs and developers to manage execution plans efficiently. By ensuring stable query performance, these enhancements help organizations build reliable and scalable database applications.

If you are running performance-critical workloads, using SQL Plan Management in Oracle 26ai can significantly improve query stability and optimization.

Time Bucketing in Oracle 26ai: Simplifying Time-Series Data Analysis ⏱️

Modern applications generate a huge amount of time-based data such as logs, transactions, IoT sensor readings, monitoring metrics, and user activity streams. Analyzing this data efficiently requires grouping events into meaningful time intervals like minutes, hours, or days.

This is where Time Bucketing becomes extremely useful in Oracle AI Database 26ai.

What is Time Bucketing?

Time bucketing is a technique used to group timestamps into fixed time intervals (called buckets). Instead of analyzing each individual timestamp, the database groups records into time ranges.

  • Every 5 minutes
  • Every hour
  • Every day
  • Every week
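The same idea can be expressed in any Oracle release with plain date arithmetic; for instance, hourly buckets with TRUNC (assuming an events table with an event_time column, as used later in this post):

```sql
-- Classic hourly bucketing without a dedicated function:
-- TRUNC(..., 'HH') snaps each timestamp down to the start of its hour.
SELECT TRUNC(CAST(event_time AS DATE), 'HH') AS bucket,
       COUNT(*)                              AS total_events
FROM   events
GROUP  BY TRUNC(CAST(event_time AS DATE), 'HH')
ORDER  BY bucket;
```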

Example:

Event Time    Bucket
----------    -------------
10:03 AM      10:00 – 11:00
10:25 AM      10:00 – 11:00
11:10 AM      11:00 – 12:00

This is useful in:

  • Monitoring dashboards
  • System logs
  • Financial transactions
  • IoT analytics
  • Observability systems

Why Time Bucketing is Important

  • Analyze trends over time
  • Improve query performance
  • Build dashboards and reports
  • Detect spikes or anomalies
  • Prepare data for AI or machine learning models

Time Bucketing in Oracle 26ai

Oracle 26ai improves time-series analytics with the TIME_BUCKET function, which assigns each timestamp to a fixed-width interval defined by a bucket width and an origin timestamp that anchors the buckets.


SELECT TIME_BUCKET(event_time,
                   INTERVAL '1' HOUR,
                   TIMESTAMP '2026-01-01 00:00:00') AS bucket,
       COUNT(*) AS total_events
FROM events
GROUP BY TIME_BUCKET(event_time,
                     INTERVAL '1' HOUR,
                     TIMESTAMP '2026-01-01 00:00:00')
ORDER BY bucket;

This query groups all events into 1-hour intervals and counts the number of events in each bucket.

Example 1: Website Traffic Analysis

Table for storing website visits:


CREATE TABLE website_logs (
    user_id NUMBER,
    visit_time TIMESTAMP
);

Query to analyze hourly traffic:


SELECT TIME_BUCKET(visit_time,
                   INTERVAL '1' HOUR,
                   TIMESTAMP '2026-01-01 00:00:00') AS hour_bucket,
       COUNT(*) AS visits
FROM website_logs
GROUP BY hour_bucket
ORDER BY hour_bucket;

This helps identify peak traffic hours and user activity patterns.

Example 2: Monitoring Application Errors


CREATE TABLE app_logs (
    log_id NUMBER,
    log_time TIMESTAMP,
    status VARCHAR2(20)
);

Detect error spikes:


SELECT TIME_BUCKET(log_time,
                   INTERVAL '10' MINUTE,
                   TIMESTAMP '2026-01-01 00:00:00') AS bucket_start,
       COUNT(*) AS errors
FROM app_logs
WHERE status = 'ERROR'
GROUP BY bucket_start
ORDER BY bucket_start;

Example 3: IoT Sensor Data Analysis


CREATE TABLE sensor_data (
    device_id NUMBER,
    reading_time TIMESTAMP,
    temperature NUMBER
);

Average temperature every 30 minutes:


SELECT TIME_BUCKET(reading_time,
                   INTERVAL '30' MINUTE,
                   TIMESTAMP '2026-01-01 00:00:00') AS time_interval,
       AVG(temperature) AS avg_temp
FROM sensor_data
GROUP BY time_interval
ORDER BY time_interval;

Advantages of Time Bucketing in Oracle 26ai

  • Faster analytics on large datasets
  • Simpler SQL queries
  • Better support for monitoring dashboards
  • Useful for AI and time-series workloads

Best Practices

  • Choose the right bucket size
  • Create indexes on timestamp columns
  • Use with aggregations like COUNT, AVG, SUM, MAX, MIN
  • Use in analytics and monitoring workloads
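For the indexing point above, a plain B-tree index on the bucketed timestamp column is usually enough to support time-range filters (a sketch for the sensor_data table used earlier):

```sql
-- Supports range scans for time-filtered bucket queries.
CREATE INDEX idx_sensor_data_time
  ON sensor_data (reading_time);
```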

Final Thoughts

Time bucketing is a powerful feature for analyzing time-series data in Oracle 26ai. It helps convert raw event streams into meaningful insights using simple SQL queries.

If you are working with logs, metrics, or streaming data, time bucketing can significantly simplify your analytics queries.

Friday, 27 March 2026

Understanding ODB Network Service Integrations: Amazon S3, Zero-ETL, and STS

Oracle Database@AWS is a powerful multi-cloud solution that combines the performance of Oracle Exadata with the scalability of AWS infrastructure. One of the key components that enables this integration is the ODB (Oracle Database) Network, which acts as the foundation for connecting Oracle database resources with AWS services.

In this blog, we will explore how ODB Network service integrations work—specifically focusing on Amazon S3 access, Zero-ETL integration, and Security Token Service (STS)—and why they are important for modern database architectures.


What is an ODB Network?

An ODB Network is a private and isolated network created inside an AWS Availability Zone to host Oracle Cloud Infrastructure (OCI) components used by Oracle Database@AWS. It acts as the communication bridge between AWS and OCI environments and is required before provisioning Exadata infrastructure or databases.

This network allows organizations to integrate Oracle databases with AWS services while maintaining secure and private connectivity.

When creating an ODB network, AWS allows optional service integrations that enable connectivity to specific AWS services directly from the database environment.

Main integrations available:
  • Amazon S3
  • Zero-ETL Integration
  • Security Token Service (STS)

1. ODB Network Access to Amazon S3

Amazon S3 integration allows Oracle databases running in Oracle Database@AWS to interact directly with S3 storage.

When enabled, the ODB network creates a secure network path that allows customers to access their own S3 buckets from within the Oracle environment.

Key Use Cases

Self-Managed Backups

Organizations can store database backups directly in their own S3 buckets.

Data Import and Export

Database administrators can move files such as:

  • Data Pump exports
  • ETL scripts
  • Data integration files
  • External datasets

Storage for Database Operations

S3 can be used as a staging area for analytics workloads or integration pipelines.
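As an illustration only (assuming the DBMS_CLOUD package is available in your environment and that a credential named S3_CRED has been created; the bucket URL and file names are placeholders), a Data Pump export file could be staged to S3 like this:

```sql
-- Hypothetical example: upload an export file from a database
-- directory to your own S3 bucket via DBMS_CLOUD.
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'S3_CRED',
    object_uri      => 'https://my-bucket.s3.amazonaws.com/exports/hr_export.dmp',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'hr_export.dmp');
END;
/
```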

Note: Oracle Database@AWS automatically enables network access for Oracle-managed automatic backups to S3 even if direct access is not enabled by the user.

2. Zero-ETL Integration with Amazon Redshift

One of the most exciting capabilities of Oracle Database@AWS is Zero-ETL integration.

Traditionally, data pipelines require Extract, Transform, and Load (ETL) processes to move data from transactional systems to data warehouses. These pipelines can be complex, slow, and costly.

With Zero-ETL integration:

  • Data is replicated directly from Oracle Database@AWS to Amazon Redshift
  • Real-time analytics becomes possible
  • Complex ETL pipelines are reduced or eliminated

This integration creates a dedicated network path through the ODB network, enabling replication workflows between Oracle databases and analytics platforms like Redshift.

Benefits of Zero-ETL

  • Near real-time analytics
  • Reduced operational complexity
  • Faster insights from transactional data
  • Improved data engineering productivity

3. Security Token Service (STS) Integration

Security is a critical part of any multi-cloud architecture, and this is where AWS Security Token Service (STS) plays an important role.

STS integration allows the ODB network to securely access additional AWS services using temporary credentials instead of long-term access keys.

When enabled, the ODB network can request temporary security tokens to interact with AWS services securely.

Why STS Matters

Enhanced Security

Temporary credentials reduce the risk associated with permanent access keys.

Controlled Access

Permissions can be defined using IAM policies and token-based access.

Integration with AWS KMS

If you enable STS with AWS KMS, you can use customer-managed encryption keys within authentication policies.

This is particularly useful for organizations with strict compliance and security requirements.

How These Integrations Fit into the Architecture

In Oracle Database@AWS architecture:

  • The ODB network connects Oracle Exadata infrastructure with AWS services
  • Applications running in Amazon VPCs can access Oracle databases via private connectivity
  • Service integrations allow databases to interact with AWS-native services without additional networking complexity

This approach simplifies hybrid cloud architecture while maintaining enterprise-grade performance and security.


When Should You Enable These Integrations?

Enable Amazon S3 Access When:

  • You need database backups in S3
  • You want file-based data integration
  • You use data lake storage architectures

Enable Zero-ETL When:

  • You want analytics in Amazon Redshift
  • Your organization is moving toward real-time data platforms
  • You want to eliminate ETL pipelines

Enable STS When:

  • You need secure access to AWS services
  • You follow IAM best practices
  • You require temporary credentials and stronger security control

Final Thoughts

The ODB Network is more than just a networking component—it is the backbone of Oracle Database@AWS integration with AWS services.

By enabling integrations such as Amazon S3, Zero-ETL, and STS, organizations can build powerful data architectures that combine transactional databases, analytics platforms, and secure cloud services in a seamless way.

These capabilities allow enterprises to:

  • Simplify data movement
  • Improve security
  • Enable real-time analytics
  • Integrate Oracle workloads natively with AWS

As multi-cloud adoption continues to grow, features like these make Oracle Database@AWS a compelling platform for modern data-driven applications.


References:
Oracle Documentation – ODB Network
AWS Documentation – Oracle Database@AWS Architecture
Getting Started with Oracle Database@AWS

Sunday, 22 March 2026

Can we use tnsping for an Oracle Autonomous 26ai database?

tnsping Not Working on macOS (zsh) – Connecting to an Oracle 26ai Autonomous Database Without It

Recently, while working on Oracle Autonomous Database 26ai, I encountered the following error on macOS:

zsh: command not found: tnsping

At first, it looked like something was missing from my Oracle client installation. However, the actual answer is that you don’t need tnsping when using SQLcl to connect to Autonomous Database.

Instead, you can connect directly using the wallet configuration.

Step 1: Set the Wallet Location

Export the wallet directory so SQLcl knows where to find the connection configuration.

export TNS_ADMIN=/Users/karandodwal/git/code/oracle/wallet

Step 2: Required Wallet Files

Your wallet directory should contain files similar to the following:

(base) karandodwal@Karans-MacBook-Air wallet % ls -ltr
total 152
-rw-r--r--@ 1 karandodwal staff 3021 22 Mar 17:33 truststore.jks
-rw-r--r--@ 1 karandodwal staff 1245 22 Mar 17:33 tnsnames.ora
-rw-r--r--@ 1 karandodwal staff 114 22 Mar 17:33 sqlnet.ora
-rw-r--r--@ 1 karandodwal staff 3030 22 Mar 17:33 README
-rw-r--r--@ 1 karandodwal staff 691 22 Mar 17:33 ojdbc.properties
-rw-r--r--@ 1 karandodwal staff 3191 22 Mar 17:33 keystore.jks
-rw-r--r--@ 1 karandodwal staff 7097 22 Mar 17:33 ewallet.pem
-rw-r--r--@ 1 karandodwal staff 6312 22 Mar 17:33 ewallet.p12
-rw-r--r--@ 1 karandodwal staff 6357 22 Mar 17:33 cwallet.sso
-rw-r--r--@ 1 karandodwal staff 25332 22 Mar 23:20 Wallet_PYDB.zip

These files are downloaded when you download the Autonomous Database Wallet from Oracle Cloud.

Step 3: Connect Using SQLcl

Once the wallet path is configured, simply run your connection command. In my case, I use an alias called atp.

(base) karandodwal@Karans-MacBook-Air ~ % atp
SQLcl: Release 24.4 Production on Sun Mar 22 23:28:13 2026
Copyright (c) 1982, 2026, Oracle. All rights reserved.

Last Successful login time: Sun Mar 22 2026 23:27:33 +05:30

Connected to:
Oracle AI Database 26ai Enterprise Edition Release 23.26.1.2.0 - Production
Version 23.26.1.2.0

Step 4: Verify the Connection

You can verify the pluggable database using:

SQL> show pdbs
CON_ID   CON_NAME                    OPEN MODE   RESTRICTED
------   --------------------------  ----------  ----------
600      RE2D3BGG7YWWTP7_PYDB       READ WRITE  NO

Key Takeaway

If you see:

zsh: command not found: tnsping

Don’t worry.

When working with Autonomous Database and SQLcl, you typically don’t need tnsping. As long as:

  • The wallet is downloaded
  • TNS_ADMIN is set correctly
  • SQLcl is installed

You can connect directly.

This approach works well on macOS using zsh and is a simple way to connect to Autonomous Database using the wallet.

Thursday, 29 May 2025

Performance Tuning with Subquery Factoring and Inline Views

Speed Up Complex Queries: Using Subquery Factoring and Inline Views in Oracle SQL

Overview: Subquery factoring (the WITH clause) and inline views break complex logic into named, reusable blocks, which reduces repeated computation and improves both readability and performance. Materialization hints can control whether Oracle physically materializes a factored subquery or merges it inline. Example Query:

WITH top_performers AS (
    SELECT employee_id, salary, department_id
    FROM employees
    WHERE salary > 10000
)
SELECT 
    d.department_name,
    tp.employee_id,
    tp.salary
FROM 
    top_performers tp
JOIN 
    departments d ON tp.department_id = d.department_id;

Use Case: Optimize queries that reuse subsets of data (e.g., high-salary employees) and reduce redundancy.
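When the optimizer merges a factored subquery that you would rather compute once, the MATERIALIZE hint forces Oracle to evaluate it into a temporary result set (INLINE does the opposite). A variation of the query above:

```sql
WITH top_performers AS (
    -- MATERIALIZE asks Oracle to compute this block once and reuse it.
    SELECT /*+ MATERIALIZE */
           employee_id, salary, department_id
    FROM   employees
    WHERE  salary > 10000
)
SELECT d.department_name,
       tp.employee_id,
       tp.salary
FROM   top_performers tp
JOIN   departments d ON tp.department_id = d.department_id;
```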

Querying and Analyzing JSON Data in Oracle SQL

Modern SQL: Querying JSON Data in Oracle Like a Pro

Overview: With more systems storing data in JSON format, Oracle's native JSON support is a game-changer. Functions such as JSON_VALUE, JSON_TABLE, and JSON_EXISTS let you query JSON columns with plain SQL. Example Query:

SELECT empno,
       ename,
       json_value(extra_info, '$.linkedin') AS linkedin_profile
FROM   emp
WHERE  json_exists(extra_info, '$.linkedin');

Use Case: Extract social media profiles or preferences stored in a JSON column (e.g., extra_info) for analytics or personalization.
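JSON_TABLE projects JSON attributes into relational columns, which is convenient when several fields are needed at once. A sketch against the same emp.extra_info column (the $.twitter attribute is hypothetical, included only for illustration):

```sql
SELECT e.empno,
       e.ename,
       jt.linkedin,
       jt.twitter
FROM   emp e,
       JSON_TABLE(e.extra_info, '$'
         COLUMNS (
           linkedin VARCHAR2(100) PATH '$.linkedin',  -- profile URL
           twitter  VARCHAR2(100) PATH '$.twitter'    -- hypothetical field
         )) jt;
```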

CTEs and WITH Clause for Clean, Reusable SQL

Write Cleaner SQL: Harnessing the WITH Clause (CTE) in Oracle

Overview: CTEs improve readability, let you reuse intermediate results, and keep complex nesting manageable. Example Query:

WITH department_avg AS (
    SELECT deptno, AVG(sal) AS avg_salary
    FROM emp
    GROUP BY deptno
)
SELECT e.empno, e.sal, d.avg_salary
FROM emp e
JOIN department_avg d ON e.deptno = d.deptno
WHERE e.sal > d.avg_salary;

   EMPNO     SAL                                   AVG_SALARY
________ _______ ____________________________________________
    7566    2975    2258.333333333333333333333333333333333333
    7698    2850                                         1800
    7788    3000    2258.333333333333333333333333333333333333
    7839    5000    2916.666666666666666666666666666666666667

Use Case: Identifying above-average performers.