Features and Use Cases
Comprehensive analysis capabilities and monitoring tools for the ZettaQuant platform.
Core Analysis Features
SEAL Analysis Framework
Our proprietary SEAL (Sentence Evaluation and Analysis Logic) framework provides:
Financial Intelligence
- Stance Analysis: Hawkish/dovish sentiment detection
- Forward-looking Signals: Identify future-oriented statements
- Uncertainty Measurement: Quantify ambiguity and confidence levels
- Custom Metrics: Build domain-specific analysis variables
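As an illustration of sentence-level stance scoring, a minimal keyword-based sketch might look like the following. The lexicons and scoring rule are invented placeholders for this example, not the platform's actual models:

```python
# Illustrative hawkish/dovish stance scoring (keyword lexicons are
# invented examples, not ZettaQuant's actual vocabulary or models).
HAWKISH = {"tighten", "tightening", "inflation", "hike", "restrictive"}
DOVISH = {"accommodative", "easing", "stimulus", "cut", "supportive"}

def stance_score(sentence: str) -> float:
    """Return a score in [-1, 1]: positive = hawkish, negative = dovish."""
    tokens = [t.strip(".,").lower() for t in sentence.split()]
    h = sum(t in HAWKISH for t in tokens)
    d = sum(t in DOVISH for t in tokens)
    total = h + d
    return 0.0 if total == 0 else (h - d) / total

print(stance_score("The committee may tighten policy as inflation persists."))
```

In practice a learned classifier would replace the lexicons; the point is only that each sentence receives a signed stance score that can later be aggregated to the document level.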
PDF Processing
- Automated text extraction and structuring
- Batch processing of document archives
- Format normalization and quality validation
- Metadata enrichment and indexing
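The normalization and enrichment steps above can be sketched as a small function that cleans extracted text, validates it, and attaches index metadata. The field names here are illustrative assumptions, not the app's actual schema:

```python
import hashlib
import re

def normalize_and_enrich(doc_id: str, raw_text: str) -> dict:
    """Collapse whitespace, validate the extraction, and attach simple
    index metadata. A stand-in for the app's processing step."""
    text = re.sub(r"\s+", " ", raw_text).strip()
    if not text:
        raise ValueError(f"empty extraction for {doc_id}")
    return {
        "doc_id": doc_id,
        "text": text,
        "char_count": len(text),
        "content_hash": hashlib.sha256(text.encode()).hexdigest(),
    }

row = normalize_and_enrich("fomc_2024_01", "Rates  were\nheld   steady.")
print(row["text"])
```

The content hash gives each document a stable fingerprint, which is one simple way to support deduplication and indexing across batch runs.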
Technical Implementation
Model Deployment
- Apply models across full datasets at scale
- Efficient processing with GPU acceleration
- Version control and model lineage tracking
- A/B testing framework for model comparison
Data Integration
- Secure Storage: All results stored in your Snowflake database
- Structured Output: Document-level metrics with sentence-level details
- Versioned Results: Reproducible analysis with full audit trail
- Custom Integration: Direct SQL access for downstream applications
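One way to picture versioned, reproducible output is a result record whose version id is a deterministic hash of the metrics, sentence details, and configuration, so identical inputs always produce the same version. This is a sketch of the idea, not the app's actual storage format:

```python
import hashlib
import json

def versioned_result(doc_metrics: dict, sentence_details: list, config: dict) -> dict:
    """Package document-level metrics with sentence-level details plus a
    deterministic version hash (illustrative field names, not the app's
    actual schema)."""
    payload = json.dumps(
        {"metrics": doc_metrics, "sentences": sentence_details, "config": config},
        sort_keys=True,
    )
    return {
        "metrics": doc_metrics,
        "sentences": sentence_details,
        "version": hashlib.sha256(payload.encode()).hexdigest()[:12],
    }

a = versioned_result({"stance": 0.4}, [{"id": 1, "label": "hawkish"}], {"model": "v1"})
b = versioned_result({"stance": 0.4}, [{"id": 1, "label": "hawkish"}], {"model": "v1"})
assert a["version"] == b["version"]  # same inputs, same version id
```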
Addendum
Metrics & Analysis Pipeline
Our SEAL (Sentence Evaluation and Analysis Logic) pipeline processes your data through several sophisticated steps to generate actionable insights.
Pipeline Components
1. Topic and Metric Definition
- Define relevant topics for filtering
- Create custom metric variables
- Set up classification parameters
- Configure analysis framework
2. Efficient Sentence Labeling
- Cortex-powered weak labeling
- Efficient sentence-level processing
- Custom labeling models
- Scalable analysis framework
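The weak-labeling step can be pictured as several cheap labeling functions voting on each sentence, with the majority label winning. This is a generic sketch of the technique; the Cortex-powered labelers themselves are not shown here, and the labeling functions below are hypothetical:

```python
# Weak labeling sketch: cheap labeling functions vote per sentence;
# each may abstain by returning None. All three functions are invented
# examples, not the platform's actual labelers.
from collections import Counter

def lf_keyword(sentence):
    return "relevant" if "rate" in sentence.lower() else None

def lf_length(sentence):  # abstains on very short sentences
    return "relevant" if len(sentence.split()) >= 5 else None

def lf_question(sentence):
    return "irrelevant" if sentence.endswith("?") else None

def weak_label(sentence, lfs=(lf_keyword, lf_length, lf_question)):
    votes = [v for lf in lfs if (v := lf(sentence)) is not None]
    return Counter(votes).most_common(1)[0][0] if votes else "abstain"

print(weak_label("The policy rate was raised by 25 basis points."))
```

Majority voting over many weak labelers is what makes the step cheap at scale: no single function needs to be accurate on its own.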
3. Document-Level Metrics
Generate comprehensive document-level metrics including:
- Stance Analysis (hawkish/dovish)
- Forward-looking Signals
- Uncertainty Measurement
- Custom Metric Construction
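Rolling sentence-level labels up into document-level metrics can be as simple as computing the share of sentences carrying each label. The label names mirror the metrics above, but this aggregation scheme is an illustrative assumption:

```python
# Sketch: aggregate sentence-level labels into document-level shares.
def document_metrics(sentence_labels: list[dict]) -> dict:
    n = len(sentence_labels)
    def share(key: str) -> float:
        return sum(s.get(key, False) for s in sentence_labels) / n
    return {
        "hawkish_share": share("hawkish"),
        "forward_looking_share": share("forward_looking"),
        "uncertain_share": share("uncertain"),
        "n_sentences": n,
    }

labels = [
    {"hawkish": True, "forward_looking": True},
    {"hawkish": False, "uncertain": True},
]
print(document_metrics(labels))
```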
4. Downstream Analysis
- Perform quantitative analysis
- Generate insights from labeled data
- Integrate with your analytics workflow
- Build custom applications on the results
Technical Implementation
Model Application
- Apply models across the full dataset
- Efficient processing at scale
- Secure computation within Snowflake
Data Storage
- Results stored in your Snowflake DB
- Document-level metrics with labeled sentences
- Versioned and reproducible outputs
Analysis Framework
Our SEAL analysis framework provides:
- Customizable metrics
- Flexible analysis options
- Integration with your workflows
- Reproducible results
Access Methods
- Streamlit UI for interactive analysis
- Stored procedures for programmatic access
- Direct SQL queries for custom analysis
Platform Overview
Our Snowflake Native App consists of two main components that work seamlessly within your Snowflake environment. All data and processing remain secure within your Snowflake infrastructure.
Key Features
- Native Snowflake Integration: All data resides in your Snowflake database
- Real-Time Data Access: Near real-time updates delivered through the native app
- Custom Metrics: Define topics and metrics tailored to your needs
- Secure Processing: All computations happen within your Snowflake environment
1. Data Selection & Ingestion
The app supports three ways to work with data:
Use Existing Data
- Select document metadata tables
- Choose tokenized sentences tables
- Use data from your existing database and schema
PDF Ingestion Tool
- Drag-and-drop interface for PDFs
- Configurable PDF parser
- Custom tokenizer selection
- Direct ingestion into Snowflake tables
Premium Datasets
- Real-time data updates
- Premium access to complete datasets
- Example:
  - Free version available on marketplace
  - Premium version with daily updates
  - Complete historical data
2. SEAL Pipeline Core Module
Our core processing pipeline helps you:
Topic Definition
- Define relevant topics
- Filter data based on relevancy
- Custom topic modeling
Classification & Metrics
- Sentiment analysis
- Stance detection
- Uncertainty measurement
- Custom metric construction
Results Storage
- Document-level metrics
- Direct storage in your Snowflake DB
- Ready for immediate analysis
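Among the classification metrics above, uncertainty measurement has a particularly simple baseline: the share of hedging terms per sentence. The hedge lexicon below is an invented placeholder, not the platform's:

```python
# Minimal uncertainty sketch: share of hedging terms per sentence.
# The lexicon is illustrative, not the app's actual model.
HEDGES = {"may", "might", "could", "possibly", "uncertain", "likely"}

def uncertainty(sentence: str) -> float:
    tokens = [t.strip(".,").lower() for t in sentence.split()]
    return sum(t in HEDGES for t in tokens) / len(tokens) if tokens else 0.0

print(uncertainty("Growth may slow and inflation could possibly persist."))
```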
Interface Options
Users can interact with the app through:
- Streamlit UI for visual interactions
- Stored procedures for programmatic access
Data Ingestion
Our native app provides multiple ways to ingest and work with data, all within your Snowflake environment.
Data Sources
1. Your Own Data
- Upload unstructured data (PDF files, etc.)
- Document ingestion through Native App component
- Direct integration with your Snowflake DB
- Secure compute within your environment
2. Premium Datasets
- Near real-time data access
- Example:
  - Latest data available only through Native App
  - Daily updates
  - Complete historical coverage
3. Existing Snowflake Tables
- Use your pre-existing document metadata tables
- Leverage already tokenized sentences
- Direct integration with your schema
Ingestion Process
Document Processing
- PDF parsing with configurable options
- Document metadata extraction
- Secure processing within Snowflake
Text Tokenization
- Configurable tokenizer selection
- Sentence-level segmentation
- Efficient storage in your Snowflake DB
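Sentence-level segmentation can be sketched with a naive regex splitter that emits one row per sentence, keyed back to the parent document. A configurable tokenizer would replace this in practice, and the column names are illustrative:

```python
import re

def tokenize_sentences(doc_id: str, text: str) -> list[dict]:
    """Naive regex sentence splitter (a stand-in for the configurable
    tokenizer): splits after ., !, or ? followed by whitespace."""
    parts = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    return [
        {"doc_id": doc_id, "sentence_id": i, "sentence": s}
        for i, s in enumerate(parts)
    ]

rows = tokenize_sentences("doc1", "Rates rose. Inflation eased! What next?")
print(len(rows))
```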
Data Storage
- All processed data stays in your Snowflake environment
- Optimized table structures
- Ready for immediate analysis
Table Structures
After ingestion, your data will be organized into two main tables:
Documents Table
- Document metadata
- Source information
- Timestamps
- Document-level attributes
Sentences Table
- Tokenized sentences
- Links to parent documents
- Ready for analysis and labeling
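The two-table layout above can be pictured as a pair of record types, with each sentence row carrying its parent document's key. The column names here are illustrative stand-ins, not the app's actual schema:

```python
# Sketch of the Documents / Sentences tables as Python records
# (column names are illustrative, not the app's actual schema).
from dataclasses import dataclass

@dataclass
class DocumentRow:
    doc_id: str
    source: str
    published_at: str  # timestamp
    title: str         # example document-level attribute

@dataclass
class SentenceRow:
    doc_id: str        # link to the parent document
    sentence_id: int
    sentence: str

doc = DocumentRow("d1", "central_bank_statements", "2024-01-31", "Policy Statement")
sent = SentenceRow(doc.doc_id, 0, "Rates were held steady.")
assert sent.doc_id == doc.doc_id  # foreign-key style link
```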