Let’s create an impact together!
Making An Impact
Intellify solves diverse business problems using cutting-edge technologies. Join us to be a part of the future!
Company Culture
- Highly fulfilling and challenging work
- Open-door policy for sharing ideas
- Support and guidance from highly experienced industry professionals
- Regular training for employees
- Flexible work timings
- A work culture that is open and casual yet responsible
OPEN POSITIONS
Senior Data Engineer (Modeler)
Job Description:
A Data Modeler designs, implements, and documents data architecture and data modeling solutions, including relational and dimensional databases. These solutions support enterprise information management and business intelligence.
Experience: 4 to 6 Years
Qualification: Bachelor’s or master’s degree in computer science or data science, or equivalent technical experience.
Must have Tools Experience: SSAS, Power BI model building, DAX, MongoDB, REST API integration
Roles and responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization & analytics).
- Work hands-on with MongoDB.
- Evaluate existing data systems.
- Analyze and translate business needs into long-term solution data models.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems.
Skills And Qualifications
- Proficiency in data modeling tools.
- Experience in designing and implementing database structures
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (e.g., Erwin, ER/Studio) required.
- Ability to collaborate with cross-functional teams to gather and analyze data requirements
- Knowledge of data warehousing and ETL processes
- Knowledge of REST API integration and service consumption
Technical Writer
Job Description:
A Technical Writer is responsible for generating innovative ideas for content while working both independently and collaboratively as part of a team.
Qualification: Bachelor’s or master’s degree in Computer Science, Computer Engineering, Information Technology, or a related field
Experience: 4 Years
Roles and responsibilities
- Write developer guides, tutorials, concept guides, and other documentation for topics that use APIs.
- Generate high-quality, creative, and engaging content for various projects and campaigns.
- Collaborate with content strategists and other team members to develop content briefs.
- Continuously refine and enhance content to produce compelling and original text aligned with brand tone and messaging.
Skills And Qualifications
- Experience working with engineering teams to improve and refine content and to create visuals and diagrams for technical support content.
- Experience with documentation publishing software such as MadCap, RoboHelp, or wikis is a plus.
AEM Developer
- Experience with Adobe Experience Platform (AEP):
- Understanding of how AEP integrates with AEM for enhanced data management, customer profiles, and personalized content delivery.
- Cloud Services:
- Proficiency with Adobe Experience Platform Cloud services and other cloud-based infrastructures.
- Experience with cloud-native development and deployment practices.
- Analytics and Insights:
- Utilize Adobe Analytics and Adobe Target for data-driven decision-making.
- Implement and optimize analytics frameworks within AEM to track and improve user engagement and content performance.
- Adobe Data Collection is a module within AEP that brings together Adobe Analytics, Target and AEM.
- Security and Compliance:
- Ensure all AEM implementations meet security and compliance standards.
- DevOps and CI/CD:
- Experience with continuous integration and continuous deployment (CI/CD) pipelines.
- Implementing DevOps practices to streamline development, testing, and deployment processes.
- Through our Microsoft engagement we leverage Azure DevOps, so candidates with expertise in this tool are preferred.
- Collaboration and Communication:
- Strong collaboration skills to work with cross-functional teams, including data scientists, AI specialists, and UX designers.
- Excellent communication skills to articulate technical concepts to non-technical stakeholders.
Good to have skills
- Security and Compliance
- Familiarity with data privacy regulations and practices, especially concerning user data processed through AI models.
- Data privacy is a high priority, given the continued rollout of compliance regulations and the growing enterprise demand for personalization.
- AI and Machine Learning Integration:
- Knowledge of Natural Language Processing (NLP) and Large Language Models (LLMs).
- Experience integrating AI models into web applications, particularly for content recommendations, personalization, and automated customer service solutions.
- Certifications
- Azure experience is ideal, as we are currently focused on our Azure Data and AI certifications at Wimmer Solutions.
AI/ML Engineer
Position: Prompt Engineer/ LLM Apps Developer
Job Description:
- Strong foundation in machine learning concepts and techniques
- Design and develop prompts for various applications, including text generation, translation, question answering, creative writing, and image generation.
- Experience in integrating LLMs into applications, platforms, or services.
- Collaborate with product teams, data scientists, and engineers to understand user needs and translate them into effective prompts.
- The AI Engineer would complement the AEM Developer well in building out our LLM App/Web Components.
- Ensure models are developed with ethical considerations in mind, actively working to identify and mitigate biases.
- Conduct experiments and benchmarking to assess the performance of various model architectures and optimize hyperparameters.
- Maintain a meticulous approach to detail and accuracy, ensuring that LLM outputs meet the highest quality standards and adhere to coding guidelines.
- Design and implement scalable solutions for deploying LLM apps, ensuring efficiency in both computation and resource usage.
- Stay updated with the latest research and advancements in AI and NLP to continuously improve model performance.
- Work closely with UX/UI teams to ensure the integration of LLMs enhances the overall user experience.
- Experience with agile methodologies and the ability to manage projects effectively to deliver on time and within scope.
- Implement robust security and data privacy measures in all aspects of model development and deployment.
- Strong communication and teamwork skills to effectively collaborate with diverse teams and stakeholders.
- Good to have – Possess relevant Azure certifications such as Azure AI Engineer Associate, Azure Data Scientist Associate, or Azure Solutions Architect Expert
Experience
2 to 3 years in the AI/ML space, including one year in prompt engineering and LLM app development.
Must have skills
Technology Stack
- Proficient in Python, TensorFlow, PyTorch, and related libraries for model development and deployment.
- Experience with LLM frameworks such as LangChain
- Experience fine-tuning LLMs from various sources, such as OpenAI, Facebook, Google, and Hugging Face
- Experience developing and consuming vector embeddings
- Efficient use of context settings in LLM apps
- Basic knowledge of databases, with a focus on structured and semi-structured data
- GitHub
Qualification
- A relevant bachelor’s or college degree in:
- Computer Science, Information Technology, Information Systems, Software Engineering, Computer Engineering
- Good to have Certification in AI
Data Solutions Architect
Job Description:
A Data Solutions Architect coordinates end-to-end activities associated with data provisioning, including collaborating closely with partners and performing risk assessments associated with data sharing.
Qualification: Bachelor’s degree in Data Science or an IT-related field, or equivalent professional experience.
Must have Tools Experience: Azure Synapse, Azure Data warehousing, SSIS, SSAS, Microsoft Fabric
Nice to Have Tools Experience: BigQuery, AWS Redshift, Talend
Experience: 10 to 12 Years
Roles and responsibilities
- Prepare a design for all metadata relating to various Extract/Transform/Load (ETL) processes
- Craft comprehensive data migration plans that encompass mapping data between the source and target systems, validating data quality, ensuring its integrity, and overseeing the safe and accurate transfer of data.
- Designing solutions to integrate data from various sources within the organization. This involves ensuring that data from different systems can be combined and used cohesively.
- Work with the Data Provisioning team and other partners to eliminate blockers.
- Ensure data sharing standards and requirements are met, bringing in other specialists where clarifications are required.
- Conduct troubleshooting on all ETL processes and effectively resolve any issues
- Support the Data Provisioning team in finding ways to improve processes and deliver value to our customers.
- Contribute to the acquisition of data from source systems to target systems.
- Work on proposals and provide the right solution architecture based on the customer’s problem statement.
Skills And Qualifications
- The ability to analyze and present statistical information
- Must have a broad understanding of Data Modeling, SQL and ETL, including practical experience with at least one ETL tool, such as SSIS.
- Programming experience in Python, Java, or Scala preferred
- Experience with database management software
- Good communication and teamwork skills.
Share your resume on hr@intellifysolutions.com