
Data Science / Analytics Jobs

Telegram channel logo: datascience_jobs — Data Science / Analytics Jobs
Channel address: @datascience_jobs
Category: Jobs
Language: Indonesian
Subscribers: 30.20K
Channel description

Opportunities in Data Science & Analytics.
Rule: Contact information must be mentioned in the hiring post.
If you are a recruiter, DM @iamompy to get write access to the channel and post your hiring requirements easily.

Ratings & Reviews

2.33 · 3 reviews

Reviews can be left only by registered users. All reviews are moderated by admins.

5 stars: 0
4 stars: 1
3 stars: 0
2 stars: 1
1 star: 1


Latest messages (4)

2023-03-31 05:13:07 Job : GCP Data Architect- DWH
Location : Hybrid/Remote (India)
Job Type: Full time / Contract

Email your resume to career@evnek.com
Job Description as below:
• Good to have: GCP certification (either GCP Data Engineer or GCP Cloud Architect)
• 15+ years of experience architecting data projects and knowledge of multiple ETL tools (e.g., Informatica, Talend, DataStage)
• 5+ years of experience in data modeling and data warehouse and data lake implementation
• Experience implementing a Teradata-to-BigQuery migration project
• Ability to identify and gather requirements to define a solution to be built and operated on GCP, and to perform high-level and low-level design for the GCP platform
• Ability to implement and provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
• GCP technology areas: Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
• Databases including Bigtable, Cloud SQL, Cloud Spanner, and Memorystore; data analytics with Dataflow, Dataproc, and Cloud Pub/Sub; Kubernetes, Docker, managing containers, container autoscaling, and container security
• Experience in design, deployment, configuration, and integration of application infrastructure resources including GKE clusters, Anthos, Apigee, and DevOps platforms
• Application development concepts and technologies (e.g., CI/CD, Java, Python)
18.5K views · O.P., 02:13
2023-03-31 05:12:30 Job Description: Engineering Manager- AI/ML
● Responsible for overall project and the team management
● Designing framework for implementation of AI/ML for IoT applications
● Data Analytics and data visualization for IoT applications
● Development of secure, anonymity-preserving, networked artificial intelligence
Minimum Qualifications and Experience:
● PhD in Electrical/Computer Science or a relevant engineering discipline with a focus on AI/ML, with at least 2 years of work experience in industry.

OR

● M.E./M.Tech in Electrical/Computer Science or a relevant engineering discipline with a focus on AI/ML, and 6 years of industry experience with applied AI/ML.

Required Expertise:
● Hands-on experience in building and leading teams dealing with digital transformation: AI/ML, IoT, device engineering, cloud
● Experience in development and validation of AI/ML frameworks.
● Industrial applications of AI/ML algorithms.
● Experience in programming languages like Python, R, Java, C++
● Knowledge of cloud computing, fog computing and edge computing
● Experience in machine learning, deep learning algorithms
Other terms:
● The positions are contractual and full-time in nature.
● The initial appointment is for 1 year, extendable based on annual performance evaluation.
Location of work:
TIH-IoT, IIT Bombay Campus, Powai, Mumbai 400076

Apply at: https://docs.google.com/forms/d/e/1FAIpQLSecQAN_An9Wk-uuTyfSm3iDGstLU-ur3nSTevZe3K-P6DUEPA/viewform?usp=sf_link
13.5K views · O.P., 02:12
2023-03-31 05:12:30 Job Description: Principal Engineer - AI/ML - 25 LPA
● Development of AI/ML algorithms and architectures for application fields using IoT
(agriculture/health/industrial/security)
● Focus on data analysis, and applicability of AI/ML in the applications of interest.
● Experience with model development, data preparation, training, and inference-ready deployment of models.
● End to end integration of Data and AI/ML platform
Minimum Qualifications and Experience:
● PhD in Electrical/Electronics & Communication/Computer Science or a relevant engineering discipline with a focus on Artificial Intelligence/Machine Learning (AI/ML)
OR
● M.E./M.Tech with 4 years of work experience, or B.Tech with 6 years of work experience, in Electrical/Electronics & Communication/Computer Science or a relevant engineering discipline with a focus on Artificial Intelligence/Machine Learning (AI/ML).
Required Expertise:
● Experience in proposing models/frameworks/approaches for using AI/ML to solve
problems.
● Development and validation of AI/ML frameworks. Familiarity with existing frameworks.
● Industrial applications of AI/ML algorithms.
● Translation of AI/ML algorithms that run on PC like platforms to Edge computing
platforms.
● Data analysis and exploration skills.
● Experience with deep learning and machine learning algorithms

● Experience in API/SDK development for accessing vision, speech, language, and decision-making AI/ML models.
Other terms:
● The positions are contractual and full-time in nature.
● The initial appointment is for 1 year, extendable based on annual performance evaluation.
Location of work:
● TIH-IoT, IIT Bombay Campus, Powai, Mumbai 400076

Apply at: https://docs.google.com/forms/d/e/1FAIpQLSecQAN_An9Wk-uuTyfSm3iDGstLU-ur3nSTevZe3K-P6DUEPA/viewform?usp=sf_link
10.2K views · O.P., 02:12
2023-03-28 16:24:27 SQL practice session: https://www.linkedin.com/posts/bansal-sakshi_google-forms-easily-create-and-analyze-activity-7045007671090921472-jm7f?utm_source=share&utm_medium=member_ios
10.1K views · Sakshi Bansal, 13:24
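For anyone following these practice sessions, a drill of the kind they cover can be run locally with Python's built-in sqlite3 module (an illustrative sketch; the table and data here are made up, not taken from the session):

```python
import sqlite3

# In-memory database with a toy table, the kind used in SQL practice drills.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 120.0), ("south", 80.0), ("north", 50.0)],
)

# Classic practice query: total sales per region, highest first.
rows = con.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('north', 170.0), ('south', 80.0)]
```

Swapping in your own schema and queries makes this a zero-setup practice sandbox.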
2023-03-27 07:19:22 SQL practice session: https://www.linkedin.com/posts/bansal-sakshi_google-forms-easily-create-and-analyze-activity-7045007671090921472-jm7f?utm_source=share&utm_medium=member_ios
11.0K views · Sakshi Bansal, 04:19
2023-03-23 12:06:45 Microsoft Certified: Azure Data Scientist
This position is fully remote, with salary in US Dollars, and urgent.

Description:
As an Azure Data Science Associate, you will embed deeply within Data Science, developing data science products. You will constantly engage with clients and cover new domains, building deep expertise with in-house tools. You will be expected to apply high standards to building predictive models and to identify high-impact areas. As data science greatly relies on the value of data, you'll be expected to use trends from a number of credible sources to add value to your work. You will work closely with many remarkable people on the team, including but not limited to our engineering, development, marketing, and social media teams.

Responsibilities:
Set up and manage data sources, experiments, and training scripts in Azure workspaces
Determine appropriate compute instances for a training workload
Create and manage experiments with Azure SDK
Manage different data types including structured, semi-structured, and unstructured
Build and maintain data flows and pipelines
Extract, process, filter, and present large data quantities in the Azure ecosystem
Define and build metrics, perform business analysis, and quantify decisions through the utilization of data
Ability to communicate technical concepts and solutions at a level appropriate for technical and non-technical audiences.

Minimum Qualifications:
An undergraduate degree in Statistics, Computer Science, Business Analysis, Economics or a related field
Certified by Microsoft as an Azure Data Scientist Associate or Azure Data Engineer Associate
Proven experience with, and applied knowledge of, at least one scripting language, such as SQL, Python, or R
Working knowledge of different types of data (structured, semi-structured, and unstructured)
Substantial coursework and practical experience in data management, data warehousing, and statistical inference
Working knowledge of big data concepts like Hadoop, MapReduce, and HDFS
Excellent written and verbal communication
Willingness to work as an individual and in a team as per the need of a project
Overlap of 4 hours with UTC-8:00 America/Los_Angeles

Apply using this link: https://admin.goboon.co/job-post/1n6qG8m5fiIXAeugAW
The entire selection process is conducted through Turing's platform (Boon App).

After your application, our Partnership Manager Leonardo Oliveira will contact you within a few days and guide you through the next steps.

If you have any questions, please let me know:
https://www.linkedin.com/in/patriciasilvabarbosa/
14.2K views · O.P., 09:06
2023-03-23 12:06:45 Analytics Manager
This position is fully remote, with salary in US Dollars.

Description:
A U.S.-based company that is tapping into advanced technologies to provide state-of-the-art property management services is looking for an Analytics Manager. The selected candidate will be responsible for detecting historical business trends and projecting future financial performance by combining and cleansing the company's data. The company's platform offers customers cutting-edge acquisition, renovation, leasing, management, maintenance, and brokerage services. This position requires 4+ hours of overlap with the PST time zone and will be a full-time, long-term position.

Job Responsibilities:
Deliver strategic insights using data and analytics tools like Tableau, Alteryx, SQL, Python, Excel, etc.
Lead and manage performance measurement and operational reporting, specifically tracking performance, KPIs, ROI, and operational trends
Present the findings of the analyses and offer recommendations to executives, operators, and outside parties
Create unit reporting for the business, including assessments of new, lost, existing, and acquisition data
Collaborate with company leaders to direct the organization's use of data, analytics, and reporting to drive business strategy, leading to more automation and improved internal customer experience

Job Requirements:
Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
At least 5 years of relevant experience as a data analyst
At least 5 years of professional work experience using analytical tools or programming languages
Demonstrable experience working with Tableau, Alteryx, MySQL, Python, SQL, etc.
Solid understanding of traditional KPIs associated with the property management or hospitality industries
Preference for people who have worked as advisors for a premier public accounting or strategy firm (transaction experience, business strategy, etc.)
Excellent written and verbal communication skills are required to explain company procedures, concepts, and difficulties to C-level executives
Must have a commitment to excellence
Must be detail oriented with exceptional judgment and decision-making abilities
Excellent relationship-building and problem-solving skills
Fluent in English communication

Apply using this link: https://admin.goboon.co/job-post/1n6rarMhB79mvCcdNM
The entire selection process is conducted through Turing's platform (Boon App).

After your application, our Partnership Manager Leonardo Oliveira will contact you within a few days and guide you through the next steps.

If you have any questions, please let me know:
https://www.linkedin.com/in/patriciasilvabarbosa/
10.2K views · O.P., 09:06
2023-03-23 12:06:45 Sr. Data Engineer
This position is fully remote, with salary in US Dollars, and urgent.

Description
Location: candidates based in Europe, Africa, or LATAM (except Brazil)

A U.S.-based company is looking for a Sr. Data Engineer. The engineer will take responsibility for the entire process of creating and implementing their work. The company enables like-minded individuals to connect with each other by joining growing communities on the platform. This position requires an extensive overlap with the EST time zone (11 AM - 3 PM).

Job Responsibilities:
Examine the platform in detail to find areas for improvement
Utilize sophisticated mathematics to assist in resolving issues for our clients
Develop production-ready, applicable machine-learning models
Use A/B testing and experimentation to iterate and fine-tune algorithms and/or models
Investigate cutting-edge algorithmic methods
Recognize innovative ideas that have potential and adapt them to the community and platform
Create, put into action, and maintain highly dependable distributed systems
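The A/B testing responsibility above can be sketched in a few lines (an illustrative example, not part of the posting; the variant counts are made up): a two-proportion z-test comparing the conversion rates of two variants, using only the Python standard library.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts 120/1000 users vs. variant A's 100/1000.
z, p = two_proportion_z_test(100, 1000, 120, 1000)
print(f"z = {z:.3f}, p = {p:.3f}")
```

With these made-up numbers the difference is not significant at the usual 0.05 level, which is exactly the kind of result that drives the "iterate and fine-tune" loop the posting describes.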

Job Requirements:
Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
At least 5 years of relevant experience as a Data Engineer
Extensive experience working with Flink and Kubernetes
Strong skills in either Python, Java or Scala
Strong visualization and basic data analysis skills
Familiarity with SQL is desirable
Experience or background with Applied ML/algorithm optimization
Proficiency with programming languages and statistical analysis (Python, SQL)
5+ years of experience in a quantitative/modeling or highly scalable computing environment
Nice to have: experience with Ads
Nice to have: experience with Airflow
Excellent English communication skills, spoken and written

Other mandatory requirements:
Recent experience in the listed skills is critical.

Scale requirements:
Worked on products with more than 1M users
At least 10K ratings with an average rating above 3.5/4.0 (4.0+ preferred)
Worked in large orgs/teams (e.g., Fortune 500 or Series D+ startups)
Worked directly with PMs
Know how to test their own code

Apply using this link: https://admin.goboon.co/job-post/1n6rglI14FcXqn0YXU
The entire selection process is conducted through Turing's platform (Boon App).

After your application, our Partnership Manager Leonardo Oliveira will contact you within a few days and guide you through the next steps.

If you have any questions, please let me know:
https://www.linkedin.com/in/patriciasilvabarbosa/
7.8K views · O.P., 09:06
2023-03-23 12:06:44 Algorithm and graph database engineer - A&
This position is fully remote, with salary in US Dollars, and urgent.

Description:
We are on the lookout for a confident TigerGraph Developer (Remote) to join our exceptional team.

What you will be doing:
In this role, you will need an excellent conceptual understanding of graph databases, property graphs, modeling, and data ingestion into these databases.

What we are looking for:
Hands-on experience with TigerGraph (MUST HAVE)
Hands-on exposure to graph databases like Neo4j, Amazon Neptune, Dgraph, JanusGraph, Azure Cosmos DB, etc.
Hands-on expertise in modeling data in the form of graph nodes and edges
Hands-on exposure to programming and scripting languages like Python/Node.js/Java
Expertise in Cypher and Gremlin query languages; network science concepts and graph algorithms
Ability to write effective queries with SPARQL, Gremlin, DQL, GSQL, or CQL
Hands-on experience in data engineering activities
Ability to apply graph algorithms to drive insights
Understanding of NoSQL databases and graph OLTP and OLAP processes
Neo4j Certified Developer certification (preferred)
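The property-graph modeling and graph-algorithm skills above can be sketched without any graph database at all (an illustrative example; the nodes, labels, and relationships are made up): entities become labeled nodes, relationships become typed edges, and a breadth-first search finds the shortest connection between two entities.

```python
from collections import deque

# A tiny property graph: nodes with labels, edges with relationship types.
nodes = {
    "alice":  {"label": "Person"},
    "bob":    {"label": "Person"},
    "acme":   {"label": "Company"},
    "widget": {"label": "Product"},
}
edges = [
    ("alice", "KNOWS", "bob"),
    ("bob", "WORKS_AT", "acme"),
    ("acme", "SELLS", "widget"),
]

def shortest_path(start, goal):
    """Breadth-first search over the edge list, ignoring edge direction."""
    adj = {}
    for src, _rel, dst in edges:
        adj.setdefault(src, set()).add(dst)
        adj.setdefault(dst, set()).add(src)
    seen, queue = {start}, deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path("alice", "widget"))  # ['alice', 'bob', 'acme', 'widget']
```

In Cypher or GSQL the same question becomes a one-line path query; the point of the sketch is only the modeling: nodes, typed edges, and an algorithm over them.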

Good to have:
Knowledge of working on cloud platforms like GCP, AWS, Azure, etc.
Knowledge and experience of CI/CD tools
Experience in optimizing graph database designs for capacity planning, performance, and the scale or complexity of data and query patterns
Knowledge and experience of machine learning
Ogma visualization library or Linkurious (graph visualization) (BIG PLUS)

Apply using this link: https://admin.goboon.co/job-post/1n6rgnaLDfNFXPtspY
The entire selection process is conducted through Turing's platform (Boon App).

After your application, our Partnership Manager Leonardo Oliveira will contact you within a few days and guide you through the next steps.

If you have any questions, please let me know:
https://www.linkedin.com/in/patriciasilvabarbosa/
8.2K views · O.P., 09:06
2023-03-19 19:37:12 https://www.linkedin.com/posts/workwithvish_senior-data-scientist-at-groww-activity-7036212113152909312-hJLV?utm_source=share&utm_medium=member_android

Hiring Senior Data Scientist at Groww
10.0K views · O.P., 16:37