We assist everyone in their personal and professional growth.
We provide a friendly atmosphere, stability and competitive salaries.
About the company
Established in 2007, Videal helps startups and companies leverage their data and workflows to get more business value by creating innovative data mining, big data processing, and analysis solutions. Videal's expertise and solutions help our clients understand different aspects of customer behavior, decision making, process optimization, and other kinds of business intent.
About the project
Our client is a US-based company that creates big-data-powered opportunities for businesses to understand their consumer patterns, and provides essential tools for boosting conversion rates with the right data. The main aim of the system is to create a variety of efficient algorithms and pipelines for business intent search. The successful candidate will participate in every stage of data preparation, from ingestion to analysis and from web crawling to machine learning, to create reports and APIs for marketing campaigns.
Requirements
- Proficiency in SQL;
- Experience in data engineering using Spark/Java/Hadoop;
- Understanding of data modeling and data warehousing concepts;
- Hands-on experience with Java;
- Understanding of the principles of parallel processing;
- Strong communication and interpersonal skills;
- ETL development background.
Nice to have
- Experience using Scala, Python, or Groovy;
- Working experience with AWS Cloud;
- Process orchestration software: Apache Airflow / Prefect / Dagster;
- Serverless computation frameworks.
Responsibilities
- Analysis, interpretation, and validation of data sources.
- Writing batch and stream data processing solutions, ETL processes, and automated workflows in Spark using Java and Scala.
- Testing data pipelines and data quality using modern approaches.
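The data-quality testing mentioned above can be as simple as automated assertions over each batch. A minimal, Spark-free sketch in plain Java (the column count and threshold are hypothetical, for illustration only):

```java
import java.util.Arrays;
import java.util.List;

// Minimal data-quality check on a batch of CSV rows (illustrative only;
// a real pipeline would run an equivalent check as a Spark job).
public class DataQualityCheck {
    static final int EXPECTED_COLUMNS = 3;   // hypothetical schema: id,event,country

    // Fraction of rows whose column count does not match the expected schema
    static double malformedRate(List<String> rows) {
        long bad = rows.stream()
                .filter(r -> r.split(",", -1).length != EXPECTED_COLUMNS)
                .count();
        return rows.isEmpty() ? 0.0 : (double) bad / rows.size();
    }

    public static void main(String[] args) {
        List<String> batch = Arrays.asList("1,click,US", "2,view,DE", "3,click");
        double rate = malformedRate(batch);
        System.out.println("malformed rate: " + rate);
        // Fail the pipeline if more than 5% of rows are malformed
        if (rate > 0.05) {
            System.out.println("data-quality gate FAILED");
        }
    }
}
```

The same gate pattern (compute a metric, compare against a threshold, fail the run) scales to null rates, duplicate keys, and row-count drift between pipeline stages.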
Why and what we expect:
- To automate development operations in several revolutionary software product companies (Videal's clients), we are looking for a DevOps Master who loves to make things work, keep bits and bytes flying in the right directions :), and make life easier for engineers and customers.
Support several small and mid-size development teams with multiple AWS accounts and instances in different regions:
- Set up and configure EC2 Linux machines and related packages (Docker, Java, PHP, node.js, Databases, etc.)
- Set up virtual private network and secure access in AWS
- Set up monitoring and resolve incidents (out of space, out of memory, certificates expiration, etc.)
- Set up and manage AWS products (Amazon ES, EC2, ELB, S3, EMR, etc.)
- Write bash scripts and automate infrastructure routine actions
- Use (where possible) Infrastructure as Code tools (AWS CloudFormation, Terraform, etc.)
- Set up and maintain CI/CD tools (Jenkins, etc)
AWS accounts cost optimization:
- Always consider how to solve each task in the most cost-efficient way.
- Propose actions to reduce infrastructure costs for existing accounts and implement them once approved.
- Follow all new AWS product announcements and suggest improvements to projects' infrastructure.
- Study AWS products and related technologies (monitoring tools, log analysis tools, databases, etc.) to be able to help the team with configuration and improvements.
- Monitor and track abnormal cost spikes to determine why they happened and what can be done to avoid them in the future.
- On projects with a dedicated DevOps position (full-time or part-time), communicate with the customer promptly by email in English.
- In case of a project infrastructure incident (ongoing or just resolved), participate in conference calls with the customer's business or technical team.
- Send planned-maintenance email notifications.
- Send emails on the fix status of critical monitoring incidents (sometimes with the investigated possible cause).
- Communicate with the Videal team and PM: questions regarding module requirements specifications, product knowledge base, and organizational matters.
- Plan your own work and maintenance time slots to avoid overlapping with US business hours (US-safe hours: 11:00-14:00 in summer and 10:00-13:00 in winter) and to avoid overtime for yourself or team members (normal office hours: 09:00 (11:00) - 18:00 (21:00), with a 1-2 hour break for lunch).
- Submit daily time reports in Redmine on the work done: hours spent on each task, with comments where possible.
- Give feedback and proposals on new ideas and possible improvements to the product/process to the Videal PM/tech lead.
Videal internal (about 10% of work time):
Professional growth (technologies):
- Create/update a personal growth roadmap/plan/skill set
- Perform the activities from the personal growth roadmap/plan (ongoing)
- Report the results of personal growth to your mentor
Professional growth (English):
- Create/update English learning roadmap/plan/skill set
- Perform the activities from English learning roadmap/plan
- Take an English test once per quarter
- We are looking for like-minded people who share our mission, philosophy, and values and will continue to grow with our company over the long term. The candidate should be process-oriented and highly accurate, with strong logical thinking, a proactive mindset, a desire and ability for continuous professional growth, the ability to analyze, process, and classify information, and attention to detail. The candidate should have at least an intermediate level of spoken and written English, with the ability to present their own ideas and opinions, give arguments, and listen to and correctly understand a client's requirements and needs during communication.
Requirements:
- Perfect knowledge of Linux-based OSes and network protocols and tools (iptables, SSH, HTTP, FTP, mail, LDAP, Samba, etc.)
- Perfect knowledge of AWS products (Amazon ES, EC2, VPC, VPN/SSH tunnels, ELB, S3, EMR, IAM, and Security) and infrastructure
- Experience with Docker
- Experience with any of the following monitoring tools: DataDog, Nagios, CloudWatch, Zabbix, Ganglia, etc.
- Ability to write bash scripts and automate routine infrastructure actions, including writing config files from scratch and modifying them
Nice to have:
- Knowledge of any programming language, preferably Java.
- Experience with Elasticsearch, Kibana and Logstash.
- Experience with big data projects.
- Knowledge of any Infrastructure as Code tools (AWS CloudFormation, Terraform, etc.)
- Experience with log analysis tools (Splunk, Papertrail, etc.)
- Good understanding of managed and self-hosted NoSQL databases and key-value stores
- Knowledge of Jenkins or any other CI/CD tools
- Knowledge of Azure and MS Server, IIS, MS SQL Server
- Experience with the Hadoop-related stack (HDFS, YARN, HBase, Spark)
- Experience with Ansible
Conditions of work:
Our 500 m2 office is located in the historical center of Kharkov (16 Korolenko St., 2 minutes from the metro station). It has a modern interior design, plenty of sunlight thanks to panoramic windows, 3 balconies, a large comfortable kitchen/game zone, and private parking. You will get a cozy desk along with a fast PC, a 24-inch monitor, and high-speed Internet/Wi-Fi.
- Opportunity to become a part of our DREAM TEAM, enjoy the atmosphere and help the company in creating innovative software products.
- Outstanding career and professional growth opportunities (including the chance to become head of the DevOps team)
- Competitive salary, personal bonuses for excellent delivery and permanent professional growth
- Paid sick leave and vacation
- An annual schedule of corporate parties including pizza, sushi, drinks, birthday celebrations, games, BBQs, quests, and picnics.
- Paid sports activities such as ping-pong and kicker (table soccer).
- Corporate education programs, certification and participation in corporate knowledge sharing process.
About the company
Onemata is trusted by people and businesses as their data and analytics partner for derivative product development, marketing, sales, and recruitment teams.
We've harnessed the power of machine learning and data science to provide novel solutions to historic growth problems for enterprise, mid-market, and SMB customers.
Requirements
• 5 years of commercial Java experience;
• strong knowledge of Java core;
• deep knowledge of SQL;
• knowledge of Hadoop related technologies (HDFS, HBase, EMR, Spark);
• knowledge of Amazon Web Services (EC2, S3, EMR);
• experience with Git;
• profiling, debugging and troubleshooting experience;
• knowledge of Linux based platforms;
• intermediate+ English;
• team player with experience working with a US client.
Nice to have
• strong knowledge of Spring framework;
• working experience and good knowledge of Solr/Elasticsearch;
• experience with Kafka;
• experience with NoSQL/key-value databases/stores (Redis);
• experience in microservices in Kubernetes;
• a desire and ability for self-growth and self-education: new programming languages (such as Kotlin and Scala), technologies, frameworks, and tools;
• strong logical thinking; skills and experience in writing algorithms;
• ability to analyze, process, classify and structure information (big data analysis);
• high attention to details and accuracy;
• a proactive mindset and approach to work;
• patience and tolerance, calmness and sense of humor.
Responsibilities
• write Java code (Java 8) for custom Spark flows to structure and filter big data;
• write and support RESTful microservices in Java for deployment as Docker containers in a Kubernetes cluster on AWS;
• use Kafka for all microservice interactions;
• analyze big data for further Spark processing;
• maintain and extend documentation in AsciiDoc;
• perform code (Java, Scala) reviews on GitHub.
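As a rough illustration of the "structure and filter" step, here is a plain Java 8 streams sketch standing in for the Spark API; the event format ("timestamp,action,country") is a hypothetical stand-in, not the client's real schema:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Plain Java 8 streams analogue of a Spark structure-and-filter flow.
// Hypothetical event format: "timestamp,action,country"
public class EventFlow {
    static List<String> structureAndFilter(List<String> rawEvents, String country) {
        return rawEvents.stream()
                .map(String::trim)                          // structure: normalize whitespace
                .filter(e -> e.endsWith("," + country))     // filter: keep one country
                .sorted(Comparator.reverseOrder())          // newest first (ISO timestamps sort lexically)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList(
                "2021-01-02,click,US", "2021-01-01,view,DE", " 2021-01-03,click,US");
        System.out.println(structureAndFilter(raw, "US"));
    }
}
```

In a real Spark flow, the same map/filter/sort chain would run over a distributed Dataset across the cluster; the transformation logic is the same, only the execution engine differs.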
About the project
An LA-based marketplace selling and delivering medical cannabis, with operations in CA, NV, and NY, that has grown as an ambitious startup overcoming many technical challenges.
Requirements
- 2+ years experience in testing with experience in Automation testing using Selenium;
- knowledge of Java;
- good knowledge of Selenium;
- experience in writing auto-tests for GUI and REST API;
- strong attention to detail and logical thinking;
- self-motivated, self-disciplined, goal-driven;
- intermediate+ English.
Nice to have
- experience performing load testing, stress testing, and end-to-end testing;
- ability to communicate easily with the client's business team to clarify requirements;
- clear understanding of the software development lifecycle and quality assurance processes;
- good organizational and time management skills;
- strong problem-solving skills, proactive;
- experience in configuring project CI pipelines;
- experience in TestCafe and/or Cypress;
- experience in writing unit tests;
- experience in a startup environment preferred;
- knowledge of SQL.
Responsibilities
- write test cases based on task descriptions;
- support current auto-tests for the GUI and write new tests;
- cover the back-end REST API with auto-tests;
- investigate test failures and either update tests or submit bugs in JIRA;
- communicate daily with the customer and dev team via Slack and Zoom during office hours;
- participate in tuning CI pipelines for test runs in staging and pre-prod environments;
- participate proactively in product quality improvements;
- help manual QA transform FullStory incidents into JIRA bugs, or communicate to the business team that something is not a bug.
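The REST API coverage above can be illustrated without any test framework. A minimal smoke test using only the JDK; since the real back-end URL is not part of this posting, it spins up a throwaway local endpoint as a stand-in and asserts on status code and payload:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

// Minimal REST API smoke test: spin up a local endpoint, call it, assert on the response.
public class ApiSmokeTest {
    public static void main(String[] args) throws Exception {
        // Stand-in for the real back end: one /health endpoint on a random free port
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();

        // The actual test: issue a GET and read back status + payload
        int port = server.getAddress().getPort();
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://localhost:" + port + "/health").openConnection();
        int status = conn.getResponseCode();
        String payload = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A").next();
        server.stop(0);

        // Checks a CI pipeline would fail on
        if (status != 200) throw new AssertionError("unexpected status: " + status);
        if (!payload.contains("\"status\":\"ok\"")) throw new AssertionError("unexpected payload: " + payload);
        System.out.println("health check passed");
    }
}
```

In practice the same assertions would target the staging or pre-prod API and run from the CI pipeline, typically via a test framework such as JUnit with REST Assured.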