✨ Introducing ContentPulse

Build Smart Chat Agents for Your Content

Connect your Contentstack CMS, customize your AI chat agent, and deploy intelligent conversations that understand your content. From login to live chat in minutes.

Contentstack OAuth Integration
Multiple LLM Support
NPM SDK Ready

Why Choose ContentPulse?

Join thousands of companies building the future of customer engagement with AI-powered chat agents that understand your content.

AI-Powered Chat Agents

Create intelligent chatbots that understand your content and provide instant, accurate responses to your users.

99.9% uptime
Lightning Fast Integration

Deploy your chat agent in minutes with our simple SDK. No complex setup required.

5 min setup
Enterprise Security

Advanced security features with encrypted data transmission and secure API endpoints to keep your content protected.

Secure by design
Advanced Analytics

Track conversations, measure engagement, and optimize your AI agents with detailed insights.

Real-time insights
Multi-Platform Support

Deploy across web, mobile, and voice platforms with our unified API and SDK ecosystem.

15+ platforms
24/7 Support

Get dedicated support from our team of AI experts. We're here to help you succeed.

24/7 support

Quick Start

Install the ContentPulse SDK and get started in seconds

terminal
$ npm install contentpulse-chat-sdk
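
Then wrap your app with the provider and drop in the chat agent. Here is a minimal sketch using the same two components shown in the comparison below; the agentId, apiBaseUrl, and apiKey values are placeholders you'd replace with your own.

App.tsx
// Minimal sketch: the provider supplies agent configuration, SmartChatAgent renders the chat UI.
import { ContentPulseProvider, SmartChatAgent } from 'contentpulse-chat-sdk';

export default function App() {
  return (
    <ContentPulseProvider agentId="your-agent-id" apiBaseUrl="contentpulse.dev">
      <SmartChatAgent apiKey="your-contentpulse-api-key" />
    </ContentPulseProvider>
  );
}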

From Complex to Simple

See how ContentPulse transforms 300+ lines of complex integration code into just 14 simple lines, while giving you more features and better performance.

ChatAgent.tsx
// Traditional approach - Building a chat agent from scratch
import { useState, useEffect, useRef, useCallback } from 'react';
import OpenAI from 'openai';
import Contentstack from 'contentstack';
import { v4 as uuidv4 } from 'uuid';

export function ChatAgent() {
  const [messages, setMessages] = useState([]);
  const [loading, setLoading] = useState(false);
  const [openaiClient, setOpenaiClient] = useState(null);
  const [contentstackClient, setContentstackClient] = useState(null);
  const [error, setError] = useState(null);
  const [retryCount, setRetryCount] = useState(0);
  const [isStreaming, setIsStreaming] = useState(false);
  const [currentStreamMessage, setCurrentStreamMessage] = useState('');
  const abortControllerRef = useRef(null);
  const messagesEndRef = useRef(null);

  // Authentication setup
  const [isAuthenticated, setIsAuthenticated] = useState(false);
  const [apiKey, setApiKey] = useState('');
  const [deliveryToken, setDeliveryToken] = useState('');

  useEffect(() => {
    // Initialize OpenAI with error handling // [!code highlight]
    try { // [!code highlight]
      setOpenaiClient(new OpenAI({ // [!code highlight]
        apiKey: apiKey || process.env.NEXT_PUBLIC_OPENAI_API_KEY, // [!code highlight]
        dangerouslyAllowBrowser: true // [!code highlight]
      })); // [!code highlight]
    } catch (err) { // [!code highlight]
      setError('Failed to initialize OpenAI client'); // [!code highlight]
    } // [!code highlight]

    // Initialize Contentstack with authentication // [!code highlight]
    try { // [!code highlight]
      setContentstackClient(Contentstack.Stack({ // [!code highlight]
        api_key: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY, // [!code highlight]
        delivery_token: deliveryToken || process.env.NEXT_PUBLIC_DELIVERY_TOKEN, // [!code highlight]
        environment: process.env.NEXT_PUBLIC_ENVIRONMENT, // [!code highlight]
        region: 'us' // [!code highlight]
      })); // [!code highlight]
      setIsAuthenticated(true); // [!code highlight]
    } catch (err) { // [!code highlight]
      setError('Failed to initialize Contentstack client'); // [!code highlight]
    } // [!code highlight]
  }, [apiKey, deliveryToken]);

  const scrollToBottom = useCallback(() => {
    messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
  }, []);

  useEffect(() => {
    scrollToBottom();
  }, [messages, currentStreamMessage, scrollToBottom]);

  const handleRateLimit = async (retryAfter = 1000) => {
    await new Promise(resolve => setTimeout(resolve, retryAfter));
  };

  const sendMessage = async (userMessage) => {
    if (!isAuthenticated || !openaiClient || !contentstackClient) {
      setError('Please configure your API keys first');
      return;
    }

    setLoading(true);
    setError(null);
    abortControllerRef.current = new AbortController();
    
    try {
      // Add user message immediately // [!code highlight]
      const userMsg = { // [!code highlight]
        id: uuidv4(), // [!code highlight]
        role: 'user', // [!code highlight]
        content: userMessage, // [!code highlight]
        timestamp: new Date().toISOString() // [!code highlight]
      }; // [!code highlight]
      setMessages(prev => [...prev, userMsg]); // [!code highlight]

      // Fetch relevant content from Contentstack with error handling // [!code highlight]
      let content = []; // [!code highlight]
      try { // [!code highlight]
        const contentQuery = contentstackClient.ContentType('articles') // [!code highlight]
          .Query() // [!code highlight]
          .search(userMessage) // [!code highlight]
          .limit(10) // [!code highlight]
          .include(['title', 'description', 'content', 'tags']) // [!code highlight]
          .addParam('include_count', true); // [!code highlight]
        
        const response = await contentQuery.find(); // [!code highlight]
        content = response[0] || []; // [!code highlight]
      } catch (cmsError) { // [!code highlight]
        console.warn('CMS query failed, proceeding without context:', cmsError); // [!code highlight]
      } // [!code highlight]
      
      // Prepare context for OpenAI with sophisticated prompt // [!code highlight]
      const contextPrompt = content.length > 0 ? `
Based on the following content from our CMS:

${content.map((item, idx) => `
Content ${idx + 1}:
Title: ${item.title || 'Untitled'}
Description: ${item.description || 'No description'}
Content: ${item.content ? item.content.substring(0, 500) + '...' : 'No content'}
Tags: ${item.tags ? item.tags.join(', ') : 'No tags'}
---`).join('\n')}

User question: ${userMessage}

Please provide a comprehensive and helpful response based on this content. If the content doesn't directly answer the question, provide general guidance and suggest where they might find more information.` : `
User question: ${userMessage}

Please provide a helpful response. Note: No specific content was found in our knowledge base for this query.`;

      // Streaming OpenAI API call with retry logic // [!code highlight]
      let attempt = 0; // [!code highlight]
      const maxAttempts = 3; // [!code highlight]
      
      while (attempt < maxAttempts) { // [!code highlight]
        try { // [!code highlight]
          setIsStreaming(true); // [!code highlight]
          setCurrentStreamMessage(''); // [!code highlight]
          
          const stream = await openaiClient.chat.completions.create({ // [!code highlight]
            model: "gpt-4", // [!code highlight]
            messages: [ // [!code highlight]
              { // [!code highlight]
                role: "system", // [!code highlight]
                content: "You are ContentPulse AI, a helpful and knowledgeable assistant. Provide detailed, accurate responses and format them nicely with markdown when appropriate." // [!code highlight]
              }, // [!code highlight]
              { role: "user", content: contextPrompt } // [!code highlight]
            ], // [!code highlight]
            stream: true, // [!code highlight]
            max_tokens: 1500, // [!code highlight]
            temperature: 0.7, // [!code highlight]
            presence_penalty: 0.1, // [!code highlight]
            frequency_penalty: 0.1 // [!code highlight]
          }); // [!code highlight]

          let fullResponse = ''; // [!code highlight]
          
          for await (const chunk of stream) { // [!code highlight]
            if (abortControllerRef.current?.signal.aborted) { // [!code highlight]
              break; // [!code highlight]
            } // [!code highlight]
            
            const content = chunk.choices[0]?.delta?.content || ''; // [!code highlight]
            if (content) { // [!code highlight]
              fullResponse += content; // [!code highlight]
              setCurrentStreamMessage(fullResponse); // [!code highlight]
            } // [!code highlight]
          } // [!code highlight]

          // Add complete message // [!code highlight]
          const aiMsg = { // [!code highlight]
            id: uuidv4(), // [!code highlight]
            role: 'assistant', // [!code highlight]
            content: fullResponse, // [!code highlight]
            timestamp: new Date().toISOString() // [!code highlight]
          }; // [!code highlight]
          
          setMessages(prev => [...prev, aiMsg]); // [!code highlight]
          setCurrentStreamMessage(''); // [!code highlight]
          break; // Success, exit retry loop // [!code highlight]

        } catch (apiError) { // [!code highlight]
          attempt++; // [!code highlight]
          if (attempt >= maxAttempts) { // Out of retries, surface the error // [!code highlight]
            throw apiError; // [!code highlight]
          } // [!code highlight]
          if (apiError.status === 429) { // Rate limit // [!code highlight]
            await handleRateLimit(Math.pow(2, attempt) * 1000); // Exponential backoff // [!code highlight]
          } // [!code highlight]
        } // [!code highlight]
      } // [!code highlight]

    } catch (error) {
      console.error('Error:', error);
      const errorMsg = {
        id: uuidv4(),
        role: 'assistant',
        content: `Sorry, I encountered an error: ${error.message || 'Unknown error'}. Please try again.`,
        timestamp: new Date().toISOString(),
        isError: true
      };
      setMessages(prev => [...prev, errorMsg]);
      setRetryCount(prev => prev + 1);
    } finally {
      setLoading(false);
      setIsStreaming(false);
      setCurrentStreamMessage('');
    }
  };

  const clearMessages = () => {
    setMessages([]);
    setError(null);
    setRetryCount(0);
  };

  const exportChat = () => {
    const chatHistory = {
      messages,
      exportedAt: new Date().toISOString(),
      totalMessages: messages.length
    };
    
    const blob = new Blob([JSON.stringify(chatHistory, null, 2)], {
      type: 'application/json'
    });
    
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = `contentpulse-chat-${Date.now()}.json`;
    a.click();
    URL.revokeObjectURL(url);
  };

  return (
    <div className="chat-container">
      {!isAuthenticated && (
        <div className="auth-setup">
          <input 
            placeholder="OpenAI API Key" 
            value={apiKey} 
            onChange={(e) => setApiKey(e.target.value)}
            type="password"
          />
          <input 
            placeholder="Contentstack Delivery Token" 
            value={deliveryToken} 
            onChange={(e) => setDeliveryToken(e.target.value)}
            type="password"
          />
        </div>
      )}
      
      <div className="messages" style={{ maxHeight: '400px', overflowY: 'auto' }}>
        {messages.map((msg) => (
          <div 
            key={msg.id} 
            className={`message ${msg.role} ${msg.isError ? 'error' : ''}`}
            style={{
              padding: '12px',
              margin: '8px 0',
              borderRadius: '8px',
              backgroundColor: msg.role === 'user' ? '#e3f2fd' : '#f5f5f5',
              marginLeft: msg.role === 'user' ? '20%' : '0',
              marginRight: msg.role === 'assistant' ? '20%' : '0',
              border: msg.isError ? '1px solid #f44336' : 'none'
            }}
          >
            <strong>{msg.role === 'user' ? 'You' : 'AI Assistant'}:</strong>
            <div>{msg.content}</div>
            <small style={{ opacity: 0.7 }}>
              {new Date(msg.timestamp).toLocaleTimeString()}
            </small>
          </div>
        ))}
        
        {isStreaming && currentStreamMessage && (
          <div className="message assistant streaming">
            <strong>AI Assistant:</strong>
            <div>{currentStreamMessage}<span className="cursor">|</span></div>
          </div>
        )}
        
        <div ref={messagesEndRef} />
      </div>
      
      {error && (
        <div className="error-banner" style={{ 
          backgroundColor: '#ffebee', 
          color: '#c62828', 
          padding: '8px', 
          marginBottom: '8px',
          borderRadius: '4px',
          border: '1px solid #ef5350'
        }}>
          {error} {retryCount > 0 && `(Attempt ${retryCount + 1})`}
        </div>
      )}
      
      <div className="chat-controls" style={{ display: 'flex', gap: '8px', marginBottom: '8px' }}>
        <button onClick={clearMessages} disabled={loading}>Clear History</button>
        <button onClick={exportChat} disabled={messages.length === 0}>Export Chat</button>
        <span style={{ fontSize: '12px', color: loading ? 'orange' : 'green' }}>
          {loading ? 'Processing...' : 'Ready'}
        </span>
      </div>
      
      <ChatInput 
        onSend={sendMessage} 
        disabled={loading || !isAuthenticated} 
        placeholder={isAuthenticated ? "Type your message..." : "Please configure API keys first"}
      />
      
      <style jsx>{`
        .cursor {
          animation: blink 1s infinite;
        }
        
        @keyframes blink {
          0%, 50% { opacity: 1; }
          51%, 100% { opacity: 0; }
        }
        
        .streaming {
          border-left: 3px solid #2196f3;
        }
      `}</style>
    </div>
  );
}

// You still need to build:
// - ChatInput component with file upload support
// - Message persistence and history
// - Real-time typing indicators  
// - Voice message support
// - Multi-language support
// - Analytics and tracking
// - Custom theming system
// - Mobile app integration
// - Advanced security measures
// - Webhook integrations
// - A/B testing framework
// - Performance monitoring
// - User session management
// - Content moderation
// - GDPR compliance
// - Accessibility features
// - And dozens of other features...
ChatAgent.tsx
// With ContentPulse SDK - Simple & Powerful! ✨
import { ContentPulseProvider, SmartChatAgent } from 'contentpulse-chat-sdk';

export function ChatAgent() {
  return (
    <ContentPulseProvider
      agentId="68cc32341ae74bab9671ad5a"
      apiBaseUrl="contentpulse.dev"
    >
      <SmartChatAgent
        apiKey="contentpulse.dev-api-key"
      />
    </ContentPulseProvider>
  );
}

Documentation

Get started quickly with our comprehensive guides and API documentation.

contentpulse-chat-sdk v1.2.1
An advanced React SDK for integrating the ContentPulse AI Chat Agent, with extensive customization options, beautiful themes, and seamless Contentstack CMS integration.

Weekly Downloads: 346
Version: 1.2.1
License: MIT
Installation: npm install contentpulse-chat-sdk
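
As an illustration of what theming might look like, here is a sketch that passes a hypothetical theme prop to SmartChatAgent. Only ContentPulseProvider, SmartChatAgent, agentId, apiBaseUrl, and apiKey appear elsewhere on this page, so the theme prop and its fields are assumptions for illustration, not documented API.

ChatAgent.tsx
// Illustrative sketch only: the `theme` prop and its shape are assumed, not documented API.
import { ContentPulseProvider, SmartChatAgent } from 'contentpulse-chat-sdk';

export function ThemedChatAgent() {
  return (
    <ContentPulseProvider agentId="your-agent-id" apiBaseUrl="contentpulse.dev">
      <SmartChatAgent
        apiKey="your-contentpulse-api-key"
        // Hypothetical customization props, shown for illustration:
        theme={{ primaryColor: '#2196f3', borderRadius: '8px' }}
      />
    </ContentPulseProvider>
  );
}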