WHO WE ARE
SwipeSense is a healthcare technology company on a mission to eliminate harm and waste in medicine. Hospitals use our platform to prevent infections, make better use of their equipment, and improve the patient experience.
We believe in the power of data and partner with hospital teams to provide them with insights to sustain positive behavior change and deliver a predictable patient journey. We value relentless experimentation, a locked-arms attitude, and a shared purpose to improve the future of healthcare.
Our vision is to be the safety platform for hospitals with a growing number of applications such as hand hygiene compliance improvement, asset tracking, nursing insights, and contact tracing.
How do we accomplish this? SwipeSense currently deploys more than 100,000 unique location sensors at hospitals nationwide. These sensors stream location information in real time to an event data pipeline, where it is transformed into location-based workflow insights for our end users.
WHO WE NEED
SwipeSense builds and manages large-scale distributed IoT hardware networks that produce massive amounts of real-time sensor data. We're looking for software engineers who have architected, worked in, or are excited to experience large-scale, high-volume, big-data systems. We process billions of data points and terabytes of data daily and are scaling to petabyte-scale workloads.
At SwipeSense, we use the right tools for the job – and with a system at this scale, we live on the cutting edge of technology, always questioning and rebuilding core components of our platform to meet the scaling demands of the future. As a member of this team, you will be responsible for designing, implementing, and maintaining these systems. We're looking for people with strong opinions about what they're building and how it's built.
You'll help architect our platform, from redesigning our data pipelines to evolving our algorithms to scaling our web services. You'll work in our SwipeSense cloud (AWS) stack, helping to collect, process, and analyze streaming sensor data and expose those results to internal and external consumers in real time.
This role has wide breadth and will touch a variety of technologies and concepts – from messaging in Kinesis to processing steps in Lambda functions to ETL systems in SQL to algorithms in Python to services in Go to Ruby on Rails APIs to React client apps.
WHAT YOU'LL DO
Share in the evolution and design of the IoT data processing platform by prototyping and building new technologies to optimize or completely replace existing platform components.
Assist with algorithm development to transform real-time IoT sensor data into customer insights.
Design and model highly scalable and performant data stores that are optimized for specific access patterns.
Build and optimize data processing algorithms written in Go.
Develop and maintain the GraphQL API.
Optimize and design for high-concurrency workloads.
Develop in React on the frontend to display real-time data in an elegant fashion through an intuitive SPA.
Write thorough regression tests and alerting procedures to build resiliency into new features.
Thoughtfully review code to promote a collaborative environment with a focus on arriving at the best solution as a team.
Write well-documented, thoughtfully organized code for people, not just for machines.
Work directly with Product Management and fellow engineers to fully understand, de-risk, and disambiguate the problem and its constraints before owning the solution.
WHAT YOU'LL NEED
5+ years of related full-time experience.
Bachelor's degree in Computer Science or Computer Engineering, or equivalent experience.
Experience writing performant, optimized code for large-scale applications (N+1 queries should give you conniptions).
Experience with big data systems and real-time stream processing.
Experience in environments with multiple teams and distributed systems.
Experience with MVC frameworks and an understanding of RESTful APIs.
A focus on consistent code style, full-coverage testing, team coding conventions, and separation of concerns.
Experience working in an agile development environment.
Comfort in developing in a terminal-heavy Linux/Mac environment.
Excellent communication skills – plays well with others; ping pong skills optional.
A curious mind and a love for navigating complex solution spaces.
Bonus if you have:
Familiarity with AWS as a platform and its various services (Kinesis, DynamoDB, and RDS are used heavily here).
Experience contributing to shared DevOps concerns such as CI servers and automated deployment.
Familiarity with container-based infrastructure with Docker and Kubernetes.
Contributions to open-source projects.
Experience with data visualization and analytics.
If your experience looks a little different from what we've identified and you think you can excel in this role, we'd love to learn more about you.
Please note: this is a full-time remote position operating on US Central Time. Candidates must be US citizens, or foreign citizens holding the required work visa.