There’s a Data Revolution Brewing
Be a Part of It

Join Data Dynamics in the #BytesToRights revolution!

We are Data Dynamics, a company driven by passion, innovation, and an unwavering commitment to building and empowering enterprises in the era of AI, where data sits at the core. Our mission: revolutionize digital trust and data democratization for enterprises. We call it the “Bytes to Rights” movement – a world where everyone, from executives to everyday users, commands the destiny of their data, empowering decision-makers at every level.
But this is no easy feat, and the journey to transformation is rarely a solo act. That’s why we say: We rise together.

If you are a changemaker with a knack for going beyond the ordinary, then Data Dynamics is the place for you.

Current Openings

Implementation Engineer – Data Management Solutions (Experience: 7–12 Years)

Location: India
Experience: 7–12 Years

We are seeking a highly technical and customer-focused Implementation Engineer to lead the deployment and configuration of our Data Dynamics Data Management platform across enterprise environments. The ideal candidate will bring strong expertise in NAS and Object Storage technologies and a mindset toward automation and intelligent data operations.

This role blends deep infrastructure knowledge with modern automation and data-intelligence practices, enabling customers to operationalize AI-driven insights around data growth, risk, and lifecycle management.

Key Responsibilities

  • Implement and configure Data Dynamics solutions across on-prem, cloud, and hybrid environments.
  • Lead end-to-end customer deployments, ensuring performance, stability, and scalability.
  • Work with NAS and Object Storage systems (NetApp, EMC, Hitachi HNAS, S3, Azure Blob).
  • Troubleshoot complex issues across CIFS, NFS, and S3 protocols.
  • Build and enhance automation (see the sketch after this list) for:
    • Installations & upgrades
    • Environment validation
    • Health checks and diagnostics
  • Collaborate with customers to understand requirements and tailor solutions.
  • Work closely with Product and Engineering teams to operationalize intelligent features such as:
    • Data classification
    • ROT (Redundant, Obsolete, Trivial) detection
    • Policy-driven data actions
  • Create and maintain deployment runbooks, best practices, and technical documentation.
  • Train customers and internal teams on platform usage and operations.
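
As a flavor of the automation this role owns, here is a minimal health-check sketch in Python. It is illustrative only: the mount path, endpoint, and utilization threshold are hypothetical placeholders, not details of the Data Dynamics platform.

    import shutil
    import socket

    # Hypothetical targets; a real run would read these from deployment config.
    MOUNTS = ["/mnt/nas_share"]            # e.g., a CIFS/NFS mount to validate
    ENDPOINTS = [("s3.example.com", 443)]  # e.g., an S3 gateway to reach

    def check_mount_capacity(path, max_used_pct=85):
        """Flag mounts whose utilization exceeds the threshold."""
        usage = shutil.disk_usage(path)
        used_pct = 100 * usage.used / usage.total
        return used_pct <= max_used_pct, f"{path}: {used_pct:.1f}% used"

    def check_endpoint(host, port, timeout=5):
        """Confirm basic TCP reachability of a storage endpoint."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True, f"{host}:{port} reachable"
        except OSError as exc:
            return False, f"{host}:{port} unreachable ({exc})"

    if __name__ == "__main__":
        results = [check_mount_capacity(m) for m in MOUNTS]
        results += [check_endpoint(h, p) for h, p in ENDPOINTS]
        for ok, message in results:
            print("OK  " if ok else "FAIL", message)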

Required Skills & Qualifications

  • Bachelor’s degree in Computer Science, IT, or a related field.
  • Strong experience with NAS (NetApp, EMC, Hitachi HNAS) and Object Storage (S3, Azure Blob).
  • Deep knowledge of CIFS, NFS, and S3 protocols.
  • Proficiency in Linux administration and troubleshooting.
  • Hands-on experience with scripting (Shell / Python) for automation.
  • Strong problem-solving skills and ability to work independently.
  • Excellent communication and customer-facing skills.

Preferred Qualifications

  • Exposure to AI-driven data management or analytics platforms.
  • Experience automating large-scale data workflows.
  • Familiarity with REST APIs and system integrations.
  • Certifications in relevant technologies (NetApp, AWS, etc.).

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

Professional Services Engineer (PSE) – Data Management (Experience: 5–10 Years)

Location: Client-Aligned / Onsite / Hybrid
Experience: 5–10 Years

We are seeking a highly experienced Professional Services Engineer to act as a dedicated technical owner for key enterprise customers. This role combines deep storage expertise with data intelligence, automation, and business-facing insights.

You will operate as a trusted advisor, managing the platform end-to-end on behalf of the customer, while translating technical signals into meaningful business outcomes.

Key Responsibilities

  • Own the end-to-end deployment and operation of Data Dynamics solutions within the customer’s environment.
  • Serve as the primary technical point of contact for the client.
  • Manage and optimize NAS and Object Storage systems (NetApp, EMC, Hitachi HNAS, S3, Azure Blob).
  • Provide advanced troubleshooting across CIFS, NFS, and S3 protocols (a sample S3 check follows this list).
  • Use Zubin’s intelligence layer to:
    • Analyze data growth and risk patterns
    • Identify ROT data
    • Support governance and compliance goals
  • Design and operationalize automation for:
    • Policy enforcement
    • Data lifecycle actions
    • Reporting and audits
  • Build Power BI dashboards to convert system data into executive-level insights.
  • Proactively recommend optimizations based on usage trends and intelligent findings.
  • Create tailored documentation and operational playbooks for the customer.
  • Train and guide customer teams to maximize platform value.
  • Ensure stability, performance, and continuous improvement of the solution.
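
As an illustration of first-pass S3 troubleshooting, here is a minimal sketch using boto3 to separate "bucket missing" from "permission denied" failures; the bucket name and endpoint are hypothetical placeholders, and a real investigation would go well beyond this.

    import boto3
    from botocore.exceptions import ClientError

    # Hypothetical values; point these at the customer's object store.
    BUCKET = "example-customer-bucket"
    ENDPOINT = "https://s3.example.com"  # omit endpoint_url for AWS-native S3

    s3 = boto3.client("s3", endpoint_url=ENDPOINT)

    def check_bucket_access(bucket):
        """Separate 'bucket missing' from 'permission denied' - a common first triage step."""
        try:
            s3.head_bucket(Bucket=bucket)
            return "bucket reachable and accessible"
        except ClientError as exc:
            code = exc.response["Error"]["Code"]
            if code == "404":
                return "bucket not found - check name and region"
            if code == "403":
                return "access denied - check credentials and bucket policy"
            return f"unexpected error: {code}"

    print(check_bucket_access(BUCKET))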

 

Required Skills & Qualifications

  • Bachelor’s degree in Computer Science, IT, or related field.
  • 5–10 years of experience with NAS and Object Storage platforms.
  • Strong knowledge of CIFS, NFS, and S3 protocols.
  • Proficiency in Linux administration and troubleshooting.
  • Strong experience with Power BI for dashboards and reporting.
  • Excellent communication, stakeholder management, and problem-solving skills.
  • Ability to work independently in dynamic, customer-owned environments.

Preferred Qualifications

  • Experience with AI-driven operational or analytics platforms.
  • Background in data governance, compliance, or enterprise data management.
  • Strong ability to interpret large datasets and present insights.
  • Scripting experience (Shell / Python) for automation.
  • Relevant certifications (NetApp, AWS, etc.).

 

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

DevOps Architect (Experience: 10+ Years)

Job Title: DevOps Architect
Location: Pune, Maharashtra
Job Type: Full-Time

About Us
Data Dynamics is a global leader in enterprise data management, focusing on Digital Trust and Data Democracy. Trusted by over 300 organizations, including 25% of the Fortune 20, Data Dynamics is committed to creating a transparent, unified, and empowered data ecosystem. The company’s AI-powered self-service data management software revolutionizes traditional data management by empowering data creators of all skill levels to have ownership and control over their data.

Job Description:

Responsibilities:

  • DevOps Framework Design: Architect and implement a comprehensive DevOps framework to support continuous integration and continuous deployment (CI/CD) processes.
  • Pipeline Automation: Design, build, and maintain fully automated CI/CD pipelines to streamline software development and deployment.
  • Infrastructure as Code (IaC): Implement and manage infrastructure using IaC tools to ensure consistency and scalability.
  • Tool Integration: Integrate various DevOps tools and technologies to create a seamless workflow.
  • Performance Optimization: Monitor and optimize the performance of CI/CD pipelines and infrastructure (a small monitoring sketch follows this list).
  • Security and Compliance: Ensure that all DevOps practices adhere to security and compliance standards.
  • Collaboration: Work closely with development, QA, and operations teams to ensure smooth integration and deployment of applications.
  • Documentation: Maintain comprehensive documentation of DevOps processes, tools, and configurations.
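
As a small illustration of the monitoring side of this role, here is a hypothetical sketch using the official Kubernetes Python client to flag pods that are not in the Running phase; it assumes cluster access via a local kubeconfig and is a starting point, not a production monitor.

    from kubernetes import client, config

    # Assumes a kubeconfig on the local machine (e.g., ~/.kube/config).
    config.load_kube_config()
    v1 = client.CoreV1Api()

    # Flag pods that are not Running - a cheap first signal when a
    # deployment pipeline misbehaves.
    for pod in v1.list_pod_for_all_namespaces().items:
        phase = pod.status.phase
        if phase != "Running":
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")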

 

Requirements:

  • Experience: 10+ years in DevOps, software development, or system administration, with a focus on DevOps architecture.
  • Education: Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Technical Skills:
    • Proficiency in designing and implementing CI/CD pipelines using tools such as Jenkins, GitLab CI, CircleCI, or similar.
    • Strong knowledge of cloud platforms (AWS, Azure, GCP) and cloud-native architectures.
    • Experience with configuration management tools (Ansible, Chef, Puppet).
    • Proficiency in scripting languages (Python, Bash, PowerShell).
    • Hands-on experience with containerization and orchestration tools (Docker, Kubernetes).
    • Familiarity with monitoring and logging tools (Prometheus, Grafana, ELK stack).
    • Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
    • Strong understanding of security best practices and compliance requirements.
  • Soft Skills:
    • Excellent problem-solving and analytical skills.
    • Strong communication and collaboration abilities.
    • Ability to lead and mentor a team.
    • Proactive attitude and ability to work independently.
  • Preferred Qualifications:
    • Experience in a startup environment.
    • Certification in cloud platforms (AWS Certified Solutions Architect, Google Professional Cloud Architect).
    • Knowledge of additional DevOps tools and technologies.

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

Senior Content & Social Media Specialist (Experience: 5–7 Years)

Location: Pune, India (Hybrid)
Department: Marketing
Experience: 5–7 Years

About Data Dynamics
Data Dynamics is a high-growth software company helping organizations maximize the value of their data. With 70+ employees and a solid revenue foundation, we have established strong product-market fit and are entering a pivotal phase of global expansion.

We are looking for a Senior Content Specialist who understands that B2B marketing is not about “fluff”—it’s about authority. We need a strategic writer who can digest complex technical concepts (Data, AI, Cloud) and translate them into compelling narratives that drive business development.

The Role
You will own the Content Engine and the Social Distribution for Data Dynamics.

This is a hands-on execution role. You will not just be managing a calendar; you will be writing the whitepapers, crafting the LinkedIn strategy, and running the paid campaigns that put our brand in front of C-level decision-makers in the UK and US. You will act as the bridge between our Product/Tech teams and the Market.

Key Responsibilities

  1. The Content Engine (High-Impact Writing)
  • Translation: Deeply understand our product and vision. Translate technical features into business benefits.
  • Asset Creation: Write and manage the production of high-value collateral, including Case Studies, Whitepapers, Datasheets, and Sales Decks.
  • Sales Enablement: Your content must help Sales sell. You will build the battle cards and pitch decks that the BD team uses to close deals.
  2. Social Media Dominance (LinkedIn Focus)
  • Own the Channel: Take full ownership of the corporate LinkedIn handle. Ensure regular, high-quality posting that establishes thought leadership.
  • Newsjacking: Stay agile. Monitor industry news (AI regulations, Data breaches, Cloud trends) and proactively create timely content that inserts Data Dynamics into the conversation.
  • Visual Collaboration: Work closely with the Senior Creative Designer to ensure your words are matched with world-class visuals and videos.
  3. Paid Social & Demand Generation
  • Campaign Execution: Plan and manage Paid LinkedIn Campaigns. You know how to target specific job titles and industries to generate leads.
  • Funnel Optimization: You aren’t just chasing “likes.” You are designing lead gen funnels where social content drives traffic to landing pages that convert visitors into MQLs.
  • Analytics: Track performance metrics and optimize weekly.

 

Required Skills & Experience

  • Experience: 5–7 years in B2B Content Marketing, strictly within Technology, SaaS, or Data-driven companies.
  • The “Pen”: Exceptional writing skills. You can explain complex concepts (like Unstructured Data Management) clearly and confidently to an enterprise audience.
  • Commercial Mindset: You understand the B2B buyer journey. You know the difference between top-of-funnel educational content and bottom-of-funnel decision-making content.
  • Paid Media: Proven expertise in LinkedIn Campaign Manager. You have run paid ads and know how to manage a budget.
  • Agility: You can switch from writing a deep-dive technical blog to a punchy social post in the same hour.

Tools & Platforms

  • Social: Expert level on LinkedIn (Organic & Paid). Familiarity with X/Twitter.
  • Creation: Google Docs, PowerPoint (for decks), Notion.
  • Marketing Tech: Experience with HubSpot (or similar CRM) and Social Analytics tools.
  • Bonus: Basic understanding of SEO and Website CMS.

Why Join Data Dynamics?

  • Global Voice: Your writing will define how the company is perceived by major enterprise customers in the US, UK, UAE, and India.
  • Strategic Growth: You aren’t just a copywriter; you are a key part of the revenue generation engine.
  • Team: Work alongside a high-energy global marketing team that values speed, quality, and results.

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

Senior Creative Designer (Experience: 5–7 Years)

Location: Pune, India
Department: Marketing
Experience: 5–7 Years
Reports To: Global Marketing Manager / Head of Marketing
 

About Data Dynamics
Data Dynamics is a high-growth software company helping organizations maximize the value of their data. With 70+ employees and a solid revenue foundation, we have established strong product-market fit and are entering a pivotal phase of global expansion.

We are not looking for just a graphic designer; we are looking for a Visual Storyteller. We need someone who can take complex concepts—like unstructured data management, AI, and cloud migration—and turn them into clean, compelling, and easy-to-understand visuals.

The Role
As our Senior Creative Designer, you will own the visual identity of Data Dynamics. You will work closely with our global teams in London and New York to ensure our brand looks world-class across every touchpoint.

This is a hybrid creative role. You must be comfortable switching gears between high-level video production (product explainers), digital design (social/web), and UI design (product interfaces). You will be the bridge between our technology and our customers’ understanding.

 

Key Responsibilities

  1. Visual Storytelling & Video (High Priority)
  • Product Videos: Lead the creation of engaging software product videos, including process flows, use-case walkthroughs, and “how-it-works” explainers.
  • Motion Graphics: Turn static technical diagrams into dynamic motion graphics that explain our value proposition in seconds.
  • Simplification: Collaborate with Product and Marketing to visualize abstract Data and AI narratives, making the complex feel simple.
  2. Digital & Brand Design
  • Marketing Collateral: Create high-impact assets for LinkedIn campaigns, digital ads, website imagery, and email banners.
  • Internal Branding: Support the HR and Leadership teams by designing professional internal assets (Onboarding kits, Policy documents, All-Hands presentations).
  3. UI & Web Experience
  • Web Design: Act as the lead designer for the corporate website, ensuring the user journey is visually consistent and modern.
  • Product UI Support: Collaborate with the product team to design interface elements (dashboards, screens, interactions) that align with the brand aesthetic.
  4. Innovation & Standards
  • AI Adoption: Actively use and explore AI-powered design tools (for image generation, video editing, and productivity) to speed up workflows and enhance creativity.
  • Brand Guardian: Ensure strict brand consistency across all global touchpoints (US/UK/India) while continuously elevating our visual quality.

Required Skills & Experience

  • Experience: 5–7 years of hands-on design experience, strictly within B2B Technology, SaaS, or Data-driven companies. You must understand the tech landscape.
  • Video Mastery: Proven experience creating product explainer videos and process flow visuals. (Portfolio required).
  • Design Versatility: You are a “full-stack” creative with excellent skills in Digital, Motion, and UI/UX design.
  • Tech Savvy: Ability to grasp complex technical concepts (Data, AI, Cloud) and translate them into visual stories.
  • Global Collaboration: Experience working with cross-functional teams (Marketing, Product, HR) in a global environment.

Tools & Platforms

  • Design & UI: Expert proficiency in Figma (Essential for UI/Web) and Adobe Creative Suite (Photoshop, Illustrator, XD).
  • Video & Motion: Proficiency in After Effects, Premiere Pro, or equivalent.
  • AI Tools: Hands-on experience with Generative AI tools (e.g., Midjourney, Firefly, AI video tools) is highly desired.

 

Why Join Data Dynamics?

  • Global Impact: Your designs will be seen by enterprise customers in New York, London, and beyond.
  • Innovation: We encourage the use of the latest AI tools to push creative boundaries.
  • Culture: Join a fast-paced, high-energy team where your work directly supports the company’s growth.

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

Senior C++ Developer (Experience: 12–15 Years)

Job Description:

  • Application Development: Directly involved in coding C++ applications and system debugging on both Windows and Linux platforms.
  • Requirement Analysis: Meet with technology managers to determine application requirements and develop technical specifications.
  • System Implementation: Analyze system requirements and implement development tasks.
  • Scalable Code Writing: Write scalable and efficient code for C++ software applications.
  • Performance Optimization: Implement performance optimization strategies to enhance application efficiency.
  • Code Review and Debugging: Review and debug C++ applications to ensure high-quality code.
  • Support and Collaboration: Provide support for other developers and collaborate effectively within the team.
  • Deployment: Deploy functional applications, programs, and services.
  • Documentation: Draft software and application operating procedures.

Technical Skills:

  • Strong OOP Concepts: Solid understanding of Object-Oriented Programming principles.
  • C++ Template Metaprogramming: Proficiency in C++ template metaprogramming.
  • Multithreading and Synchronization: Experience with multithreading and synchronization using C++14 semantics.
  • Thread Pool and Queuing Concepts: Knowledge of thread pool and queuing concepts.
  • Low-Level Access: Experience with low-level access to Win32 file I/O APIs.
  • Performance Optimization: Expertise in performance optimization strategies.

Preferred Skills:

  • gRPC: Experience with gRPC client and server.
  • Kafka: Knowledge of Kafka producer/consumer in C++.

Qualifications:

  • Experience: Proven experience in developing multi-threaded applications.
  • Education: Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Soft Skills: Strong problem-solving skills, excellent communication abilities, and a proactive attitude.

Job Details:

  • Work Location: Pune
  • Work Mode: Work from Office
  • Notice Period: 30 Days or less
  • Domain: Storage preferred

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

Elasticsearch Database Architect & Designer (Experience: 8–10+ Years)

Job Summary

We are looking for a highly skilled Elasticsearch Database Architect & Designer with 8-10 years of relevant experience in designing, implementing, and optimizing Elasticsearch clusters. The ideal candidate will have a deep understanding of Elasticsearch architecture, query optimization, data indexing, and high-availability design. You will play a crucial role in ensuring scalability, performance, and security of Elasticsearch deployments while integrating them into enterprise applications and cloud environments.

Key Responsibilities

Elasticsearch Architecture & Design

  • Architect, design, and implement scalable, resilient, and high-performance Elasticsearch clusters.
  • Define indexing strategies, mapping configurations, and shard allocation to optimize search performance.
  • Ensure high availability, disaster recovery, and failover strategies for mission-critical applications.
  • Develop multi-cluster and cross-cluster search solutions for large-scale distributed environments.

Indexing, Querying & Performance Optimization

  • Optimize Elasticsearch queries, aggregations, and filters to improve search efficiency.
  • Implement index lifecycle management (ILM) for data retention and archiving strategies (a sample policy sketch follows this list).
  • Fine-tune Elasticsearch sharding, caching, and thread pool settings to maximize throughput.
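
For illustration, here is a minimal sketch of registering an ILM policy through the Elasticsearch REST API from Python; the cluster URL, policy name, and rollover/retention thresholds are placeholder assumptions, not values from this posting.

    import requests

    ES_URL = "http://localhost:9200"  # placeholder; point at the real cluster

    # Hypothetical policy: roll over hot indices at 50 GB or 30 days,
    # then delete indices older than 90 days.
    policy = {
        "policy": {
            "phases": {
                "hot": {
                    "actions": {"rollover": {"max_size": "50gb", "max_age": "30d"}}
                },
                "delete": {"min_age": "90d", "actions": {"delete": {}}},
            }
        }
    }

    # PUT _ilm/policy/<name> registers (or updates) the lifecycle policy.
    resp = requests.put(f"{ES_URL}/_ilm/policy/logs-retention", json=policy, timeout=30)
    resp.raise_for_status()
    print(resp.json())  # {"acknowledged": true} on success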

Security & Compliance

  • Implement RBAC (Role-Based Access Control), authentication, and encryption using X-Pack Security.
  • Ensure compliance with GDPR, HIPAA, and other data security standards.

Data Ingestion & Integration

  • Design and develop data ingestion pipelines using Logstash, Beats, and custom connectors.
  • Integrate Elasticsearch with various data sources (SQL/NoSQL databases, message queues, APIs).

Monitoring & Maintenance

  • Set up monitoring and alerting using Kibana, Prometheus, and Elastic Stack monitoring.
  • Perform proactive cluster health checks, backup/restore, and index management.

Cloud & DevOps

  • Deploy and manage Elasticsearch in AWS, Azure, or GCP using Terraform, Kubernetes, or Helm.
  • Automate cluster provisioning and scaling using CI/CD pipelines and Infrastructure as Code (IaC).

Programming & Scripting

  • Write automation scripts using Python, Bash, or Golang for cluster maintenance.
  • Integrate Elasticsearch with backend services using Java, Python, or Node.js.

Job Details:

  • Work Location: Pune
  • Work Mode: Work from Office
  • Notice Period: 30 Days or less
  • Domain: Storage preferred

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

Data Management Enterprise Software Architect (Experience: 8+ Years)

Company Overview:

Data Dynamics, Inc. is a dynamic and innovative company dedicated to providing enterprise-class data management software that addresses data compliance, governance, access, and lifecycle management in a hybrid cloud environment. We are seeking a talented and experienced Software Architect to join our growing team and play a crucial role in shaping the architecture of our software solutions.

Position Overview:

As a Data Management Enterprise Software Architect at Data Dynamics Software Solutions India Pvt Ltd, you will be responsible for designing and overseeing the implementation of scalable and efficient data management systems. Your expertise in PostgreSQL and Elastic Database will be pivotal in shaping our architecture to meet the evolving needs of our clients.

Key Responsibilities:

  1. Architectural Design:
    • Develop and communicate the overall architecture vision for our data management solutions.
    • Design scalable and performant data management systems using PostgreSQL and Elastic Database technologies.
  2. Technical Leadership:
    • Provide technical leadership to development teams, ensuring alignment with architectural best practices.
    • Mentor and guide team members on effective utilization of PostgreSQL and Elastic Database in software solutions.
  3. Collaboration:
    • Work closely with product managers, business analysts, and stakeholders to understand data management requirements.
    • Collaborate with cross-functional teams to align architectural decisions with business goals.
  4. Database Expertise:
    • Showcase expertise in PostgreSQL and Elastic Database to optimize database design and performance.
    • Evaluate and recommend database technologies and features that enhance data management capabilities.
  5. Prototyping and Proof of Concepts:
    • Create prototypes and proof of concepts to validate architectural decisions, especially related to data storage and retrieval.
    • Assess new technologies and trends to identify opportunities for innovation in data management.
  6. Documentation:
    • Document and communicate data management architectural decisions, guidelines, and best practices.
    • Create comprehensive system documentation, including database schemas, configurations, and deployment procedures.

 

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • Proven experience as a Software Architect with a focus on data management.
  • Strong expertise in designing and implementing scalable, distributed, and high-performance data systems, particularly using PostgreSQL and Elastic Database.
  • Proficiency in SQL and database optimization techniques.
  • Excellent communication and interpersonal skills.
  • Leadership experience and the ability to mentor and guide development teams.
  • Up-to-date knowledge of industry trends in data management.

Join us at Data Dynamics and contribute to the success of our cutting-edge data management solutions. If you are passionate about architecting enterprise data systems and possess expertise in PostgreSQL and Elastic Database, we would love to hear from you.

Job Details:

  • Work Location: Pune
  • Work Mode: Work from Office
  • Notice Period: 30 Days or less
  • Domain: Storage preferred

Please submit your resume via email to dd_hr@datdyn.com.

Apply Now

AI/ML Developer/Architect (Experience: 7+ Years)

Company Description

Data Dynamics is a global leader in enterprise data management, focusing on Digital Trust and Data Democracy. Trusted by over 300 organizations, including 25% of the Fortune 20, Data Dynamics is committed to creating a transparent, unified, and empowered data ecosystem. The company’s AI-powered self-service data management software revolutionizes traditional data management by empowering data creators of all skill levels to have ownership and control over their data.

Job Summary:

We are looking for a highly skilled AI/ML Developer/Architect with 7+ years of experience in designing, developing, and deploying Machine Learning (ML) and Artificial Intelligence (AI) solutions. The ideal candidate should be proficient in Python, TensorFlow, PyTorch, and cloud platforms (AWS, Azure, GCP) and should have hands-on experience in building end-to-end AI/ML models, MLOps pipelines, and scalable AI architectures.

Key Responsibilities:

AI/ML Development:

  • Design, develop, and optimize ML/DL models for real-world applications across multiple industries and use cases.
  • Collaborate with data scientists, engineers, and stakeholders to define model requirements and success metrics.
  • Implement, test, and deploy AI models using frameworks like TensorFlow, PyTorch, or Scikit-learn to solve complex business problems.
  • Develop reusable model components to accelerate development and experimentation cycles.
  • Fine-tune models for accuracy, performance, and efficiency through hyperparameter optimization and architecture experimentation.
  • Perform regular model evaluations to assess bias, drift, and robustness to ensure fairness and reliability.

MLOps & Deployment:

  • Build scalable ML pipelines and deploy models using Docker, Kubernetes, and cloud services (AWS/GCP/Azure) for both batch and real-time applications.
  • Establish automated CI/CD pipelines for model versioning, testing, and deployment using MLflow, Kubeflow, or SageMaker.
  • Implement model monitoring, logging, and alerting to ensure continuous model performance and health checks post-deployment.
  • Optimize AI solutions for low-latency and high-availability performance under varying workloads.
  • Implement infrastructure as code (IaC) practices to maintain and deploy AI/ML infrastructure in a repeatable manner.
  • Work with cross-functional teams to ensure that data security, compliance, and privacy policies are integrated into MLOps pipelines.

AI Architecture & Design:

  • Architect end-to-end AI/ML solutions, including data ingestion, preprocessing, feature engineering, training, and inference pipelines.
  • Define scalable, modular, and cost-effective AI architectures that align with enterprise goals and technology stacks.
  • Design solutions to support both on-premise and cloud-based AI workflows for flexibility and scalability.
  • Create reusable design patterns for AI model integration with existing enterprise systems, APIs, and databases.
  • Implement best practices for model governance, including compliance with regulatory standards, auditability, and explainability.
  • Work with business leaders to translate strategic objectives into AI-driven initiatives and roadmaps.

Data Engineering & Processing:

  • Work with large, complex datasets to optimize ETL pipelines for AI model training and inference.
  • Design and build scalable data pipelines using distributed processing frameworks like Spark, Hadoop, or Dask.
  • Collaborate with data engineering teams to enhance data accessibility, quality, and reliability for machine learning workflows.
  • Leverage SQL/NoSQL databases and data lakes to create data schemas and structures that support efficient ML operations.
  • Implement feature stores and data cataloging tools to streamline feature reuse and data discovery across teams.
  • Develop and maintain data governance frameworks to ensure data security, privacy, and compliance.

Research & Innovation:

  • Stay updated with cutting-edge AI research and trends, including advancements in Generative AI, NLP, and Computer Vision technologies.
  • Experiment with LLMs (Large Language Models) and generative AI models (e.g., GPT, Stable Diffusion) to develop innovative AI solutions.
  • Prototype and evaluate emerging AI technologies to assess their applicability to business problems.
  • Contribute to open-source AI/ML projects, research papers, and industry conferences to establish thought leadership.
  • Collaborate with universities, research institutes, and external partners to foster innovation and access new AI capabilities.
  • Identify opportunities to apply AI in unexplored areas to create competitive advantages for the organization.

 

Required Skills & Qualifications:

Programming & AI Frameworks:

  • Proficiency in Python and key AI libraries such as TensorFlow, PyTorch, and Keras, with experience in both supervised and unsupervised learning models.
  • Experience working with computer vision libraries like OpenCV and other image/video processing frameworks.
  • Deep expertise in natural language processing (NLP) techniques using Transformers, BERT, GPT, and related models.
  • Strong understanding of ML algorithms, deep learning architectures (CNNs, RNNs, LSTMs), and optimization techniques (e.g., gradient descent, hyperparameter tuning).
  • Proficiency in at least one secondary programming language (e.g., Java, C++, or Go) to support AI integration into legacy systems.
  • Experience with tools for model evaluation, visualization, and performance monitoring such as TensorBoard and ML visualization dashboards.

MLOps & Deployment:

  • Hands-on experience with Docker for containerization and Kubernetes for orchestration of scalable AI model deployments.
  • Familiarity with web frameworks like FastAPI and Flask for serving AI models and building RESTful APIs (a minimal serving sketch follows this list).
  • Experience with cloud-based ML services such as AWS SageMaker, GCP Vertex AI, or Azure ML, including managing pipelines and infrastructure automation.
  • Expertise in using MLOps tools like MLflow, Kubeflow, or Argo Workflows for model tracking, lifecycle management, and version control.
  • Knowledge of serverless architecture and microservices deployment strategies to optimize cloud infrastructure costs and performance.
  • Ability to implement monitoring, logging, and auto-scaling for AI models in production environments.
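
As a small illustration of the serving pattern above, here is a minimal FastAPI sketch that exposes a pre-trained model behind a REST endpoint; the model file, feature shape, and file name are hypothetical stand-ins, and it assumes Python 3.9+.

    import pickle

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    # Hypothetical pre-trained model; any object exposing .predict() works.
    with open("model.pkl", "rb") as fh:
        model = pickle.load(fh)

    class Features(BaseModel):
        values: list[float]  # flat feature vector for one example

    @app.post("/predict")
    def predict(features: Features):
        # scikit-learn style estimators expect a 2-D array: one row per example.
        prediction = model.predict([features.values])
        return {"prediction": prediction.tolist()}

    # Run with: uvicorn serve:app --port 8000  (assuming this file is serve.py)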

Data Engineering & Processing:

  • Expertise in working with data analysis libraries such as Pandas, NumPy, and SciPy for data manipulation and exploration.
  • Strong experience with both SQL (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, DynamoDB), including data modeling for machine learning.
  • Experience with big data processing frameworks such as Spark, Hadoop, and Kafka to handle large-scale data pipelines.
  • Proficiency in feature engineering techniques, including categorical encoding, scaling, and dimensionality reduction for complex datasets.
  • Familiarity with data augmentation, synthetic data generation, and model interpretability techniques (e.g., SHAP, LIME) to enhance model robustness.
  • Understanding of data governance practices, including data lineage, security, and compliance in large organizations.

Software Development & Architecture:

  • Strong background in software engineering principles, including object-oriented programming (OOP), design patterns, and best practices.
  • Proficiency with version control tools such as Git, including experience with branching strategies, pull requests, and code reviews.
  • Experience in implementing CI/CD pipelines to automate testing, model deployment, and infrastructure updates.
  • Ability to design scalable and modular AI/ML architectures that integrate seamlessly with enterprise applications and data platforms.
  • Strong knowledge of API development, including RESTful API design, authentication, and performance optimization.
  • Familiarity with infrastructure as code (IaC) tools such as Terraform, CloudFormation, or Ansible to support scalable deployment automation.

 

Soft Skills & Others:

  • Excellent problem-solving and analytical skills.
  • Strong written and verbal communication skills.

 

Preferred Qualifications (Good to Have):

  • Knowledge of Generative AI, Stable Diffusion, LLMs, and multimodal AI models.
  • Experience with edge AI and hardware-accelerated ML (NVIDIA, TPU, FPGA).
  • Contributions to open-source AI/ML projects or research papers.

Job Details:

  • Work Location: Pune
  • Work Mode: Work from Office
  • Notice Period: 30 Days or less
  • Domain: Storage preferred

If you’re passionate about building and deploying AI/ML solutions at scale, we’d love to hear from you! Please email your resume to dd_hr@datdyn.com.

Apply Now

PostgreSQL Database Architect & Designer (Experience: 8–10 Years)

Job Summary

We are seeking a highly experienced PostgreSQL Database Architect & Designer with 8–10 years of relevant experience to design, implement, and optimize high-performance, scalable, and secure PostgreSQL database solutions. The ideal candidate will be responsible for database architecture, query optimization, performance tuning, and database security, and will play a key role in designing robust data solutions with strong high-availability and disaster recovery strategies.

 

Key Responsibilities

PostgreSQL Architecture & Design

  • Architect and implement scalable, resilient, and high-performance PostgreSQL databases.
  • Define and optimize database schema, indexing strategies, and partitioning techniques.
  • Design and manage multi-node PostgreSQL deployments for high availability.

Performance Tuning & Optimization

  • Optimize SQL queries, indexing, and caching to improve database performance.
  • Perform query execution plan analysis and optimization.
  • Fine-tune PostgreSQL settings (vacuum, autovacuum, connection pooling) for better resource utilization.

High Availability & Replication

  • Implement Streaming Replication, Logical Replication, and Failover mechanisms using Patroni or other HA tools.
  • Develop disaster recovery and backup strategies using pgBackRest, Barman, and PITR (Point-in-Time Recovery).

Security & Access Control

  • Implement Role-Based Access Control (RBAC), SSL/TLS encryption, and auditing.
  • Ensure compliance with GDPR, HIPAA, and other data security standards.

Monitoring & Maintenance

  • Set up database monitoring, logging, and alerting using pg_stat_statements, pgAudit, Prometheus, and Grafana.
  • Proactively conduct database health checks, performance tuning, and maintenance.

Cloud & DevOps

  • Deploy and manage PostgreSQL on AWS RDS, Aurora PostgreSQL, Azure Database for PostgreSQL, or Google Cloud SQL.
  • Automate database provisioning and scaling using Terraform, Ansible, Kubernetes, and Helm.

Scaling & Distributed Databases

  • Implement Sharding, Table Partitioning, and Read Replicas for scaling PostgreSQL.
  • Use PgBouncer or Pgpool-II for connection pooling and load balancing.

 

Advanced PostgreSQL Features

  • Work with JSONB, Full-Text Search (tsvector), and GIN/GiST indexing for advanced search capabilities.
  • Implement PostGIS for geospatial data processing.
  • Leverage pgvector for AI-driven similarity search.

Programming & Automation

  • Develop stored procedures and functions using PL/pgSQL, PL/Python, or PL/Java.
  • Automate routine tasks using Python, Bash, or Golang (see the sketch after this list).
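
For illustration, here is a minimal automation sketch in Python with psycopg2 that surfaces the slowest statements from pg_stat_statements; the connection settings are placeholders, the extension must already be enabled, and the column names assume PostgreSQL 13 or later.

    import psycopg2

    # Placeholder connection settings; use the target cluster's real values.
    conn = psycopg2.connect(host="localhost", dbname="appdb",
                            user="postgres", password="secret")

    # Top statements by total execution time (requires pg_stat_statements).
    QUERY = """
        SELECT query, calls, total_exec_time, mean_exec_time
        FROM pg_stat_statements
        ORDER BY total_exec_time DESC
        LIMIT 10;
    """

    with conn, conn.cursor() as cur:
        cur.execute(QUERY)
        for query, calls, total_ms, mean_ms in cur.fetchall():
            print(f"{total_ms:10.1f} ms total | {calls:8d} calls | "
                  f"{mean_ms:8.2f} ms avg | {query[:60]}")

    conn.close()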

 

Job Details:

  • Work Location: Pune
  • Work Mode: Work from Office
  • Notice Period: 30 Days or less
  • Domain: Storage preferred

If you’re passionate about architecting high-performance PostgreSQL solutions, we’d love to hear from you! Please email your resume to dd_hr@datdyn.com.

Apply Now

From Vision to Reality: Our Collaborative Approach

Customer-Centric

We don’t just create solutions, we craft experiences. At Data Dynamics, the customer is at the heart of everything we do. Your imagination will be fuelled by a deep understanding of their needs, empowering you to develop cutting-edge solutions that make a real impact.

Cutting-Edge Technology

We’re pioneers in the AI revolution, constantly evolving with the latest technologies and tools. Here, your ideas have the potential to become groundbreaking realities. We provide the resources and the collaborative spirit to turn your boundless imagination into tangible results.

Boundless Collaboration

Forget silos, embrace synergy! At Data Dynamics, we foster a culture of open communication and inter-team collaboration. Your brilliant colleagues become your work family, pushing each other to learn, grow, and achieve greatness together.

Impactful & Rewarding
Our work isn’t just a job, it’s a journey of continuous learning and impactful contributions. Tackle exciting challenges, collaborate with brilliant minds, and see your work directly influence the success of our business and the lives of our customers. Your career growth will be fuelled by the satisfaction of knowing you’re making a difference.

The Perks

Break the Mold

Ditch the 9-5 with flexible hours that fit your life.

Competitive Compensation

Enjoy a competitive salary with the opportunity to share in our success through profit sharing.

Work from Anywhere

Enjoy the location liberation of a hybrid work style.

Unlimited PTO

Recharge and explore with unlimited paid time off.

More Than Work, It's Family

We foster a supportive work culture with trusted leadership, open transparency & communication, and a focus on fun.

Holistic Wellbeing

Take care of your mind and body with our comprehensive health benefits, including medical coverage.

Data Dynamics in the Spotlight

News, events, and articles

Your Guide to What’s Next

Insights, Strategies, and Trends to Keep You in the Know

Featured

Does Data Localization Alone Guarantee Privacy? The Unspoken Challenges

Featured

Visibility Into Existing Data Footprint Across All Data Constituents

Value, Risk and Potential – a Modern Approach to Managing Data

Insight Jam Podcast – A Conversation with Piyush Mehta, CEO, Data Dynamics, on Enterprise Data Management and Data Sovereignty

David Reilly: Unlocking the Value of Data with Data Dynamics' Zubin

Recognized By The Best!
