Home
Professional Experience
Education
Lifestyle and Interests

George Bishop

Technological Background

Azure Cloud

Experienced in the Azure cloud for application hosting and for managing company infrastructure.

VM management, Kubernetes cluster setup and management, Azure Static Web Apps, cost management and reduction exercises, access control and permissions administration, and integration of third-party services into Azure subscriptions.

Hosted .NET applications, React apps, Python apps, Microsoft SQL Server and Postgres servers (Azure-managed and manually managed), mailing clients, staff VMs, and server VMs. Deployed into both Linux and Windows environments.

.NET

Worked extensively in the .NET development space, covering .NET Framework, Standard, and Core environments.


WPF desktop development, REST API development hosted in Windows Services and ASP Web Services, embedded electronics and firmware deployment.


Primarily C#, also covering Visual C++ and embedded C development.

Robotics and Electromechanical Systems

Introduced high-quality automatic cell imaging systems into the BioTech market, covering functionality including:

  1. Micron-accurate mechanical positioning
  2. High-resolution brightfield and fluorescence imaging
  3. AI image-processing algorithm integration
  4. Robotic handling of high-value biological samples

DevOps

Extensively used the Azure DevOps platform to enable disciplined, business-orientated Agile development practices in development teams.


Tracking requirements via Epics and Features to allow visibility for business Product Owners/Managers.


Supporting specialist development teams with the full scope they need when creating User Stories, enabling technical development of the platform.

Automated Testing

Setup and integrated automation technologies such as TestComplete, Ranorex, and Selenium.


Regression testing built into build pipelines, with dedicated sprint resource to ensure new development is covered.


Retroactive building of use-cases and test suites for legacy products to stabilize them for feature additions or maintenance.

Documentation Control

Using Atlassian's Confluence platform to empower knowledge sharing from the software department via live, up-to-date documentation.


Documents targeted at specific departments to allow control over the information made available.


Creation of on-boarding materials, detailed requirements-capture documentation, and service bulletins/memos for re-circulation.

Software Development Environment

Setup and management of development environments in Visual Studio with Git and ReSharper, ensuring functionality with the Azure DevOps test suite.


Ensured standardized development practices via:

  • Shared ReSharper configuration to standardize code style
  • Requiring recorded execution of Acceptance Criteria and Test Cases by development personnel
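
A shared style configuration of this kind can be sketched as an .editorconfig fragment, a format both ReSharper and Visual Studio honour; the specific rules below are illustrative examples only, not the team's actual settings:

```ini
# Illustrative .editorconfig checked into the repository root (example rules only).
root = true

[*.cs]
indent_style = space
indent_size = 4
# Example .NET code-style rules shared across the team:
csharp_style_var_for_built_in_types = false:suggestion
csharp_new_line_before_open_brace = all
```

Keeping this file in source control means every developer's IDE applies the same formatting and inspection severities automatically.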

Quintain Analytics (June 2024 - Present)

Senior Product Developer

After Advanced Instruments, I took a position at Quintain Analytics, a FinTech startup which had recently relocated from Hong Kong back to the UK.


I took the combined role of Senior Product Developer, covering the responsibilities of Product Management, Senior Software Development, and DevOps/Deployment/Infrastructure.

In this role, my focus included taking ownership of product platforms from design to deployment; architecting and executing refactors, redesigns, and new product development; and interviewing prospective staff and onboarding them to the team.


As such, my role covered various areas, including:

• Establishing and expanding Azure resources to enhance flexibility and resource efficiency for development and deployment.

• Azure resource cost management, where we managed to cut our daily spend in half.

• Reviewing applications, interviewing prospective hires, and managing onboarding and workload of new employees.

• Restructuring Azure DevOps processes to enforce best practices in Epic, Feature, Story, Task, and Bug management, optimizing workflow and collaboration.

• Designing and implementing scalable CI/CD pipelines using YAML in Azure DevOps, accelerating automated build and deployment processes.

• Documenting intended code style across multiple development areas (.NET, TypeScript, Python).

• Defining and documenting requirements with stakeholders, using Azure DevOps items and delivery plans to maintain clear project timelines.

• Architecting and deploying greenfield .NET API applications from concept to production, ensuring high-performance and secure systems.

• Developing automated ETL solutions for data synchronization between applications, improving data integrity and operational efficiency.

• Modernizing and refactoring applications in .NET 6.0, .NET Framework 4.6.2, and .NET Core 2.2, aligning legacy systems with current standards.

• Leading database architecture and implementation with SQL Server and Postgres, delivering scalable, efficient data solutions.

• Assisting refactor of React-based front ends with TypeScript and JavaScript to improve user experience and maintainability.

• Managing Kubernetes deployment environments, ensuring scalability and high availability.

• Overseeing SSL certificate integration across multiple signing authorities, streamlining security measures.

• Configuring DNS for development and production access.
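
As a minimal illustration of the ETL-style synchronization work listed above, the sketch below shows the extract/transform/load shape; all record shapes and field names here (external_id, name, and so on) are hypothetical, not drawn from the actual applications:

```python
# Minimal ETL sketch for synchronizing records between two applications.
# All names and record shapes are hypothetical illustrations.

def extract(source_rows):
    """Pull raw records from the source application, dropping rows with no stable key."""
    return [row for row in source_rows if row.get("id") is not None]

def transform(rows):
    """Normalise raw records into the target application's schema."""
    return [
        {"external_id": row["id"], "name": row["name"].strip().title()}
        for row in rows
    ]

def load(records, target):
    """Upsert into the target keyed on external_id, so repeated runs stay consistent."""
    for record in records:
        target[record["external_id"]] = record
    return target

source = [
    {"id": 1, "name": "  acme corp "},
    {"id": None, "name": "orphan row"},  # dropped: no stable key
    {"id": 2, "name": "beta ltd"},
]
target_store = {}
load(transform(extract(source)), target_store)
```

Upserting on a stable key rather than appending is what keeps repeated synchronization runs idempotent and preserves data integrity in the target.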

Advanced Instruments LLC (Aug 2021 - June 2024)

Software Engineering Manager

After briefly serving as Senior Software Engineer, I moved into the role of Software Engineering Manager, responsible for the development of a greenfield platform that would unify our entire new product range under a single software platform.


The scope of this project was to have a single platform capable of:

  • Executing hardware control and image capture from each of the new products in the product line
  • Introducing homogeneity in how data is stored and presented for review
  • Normalizing the reporting functionality of the products so that it is common across the product portfolio
  • Ensuring data integrity for compliance with CFR 21 Part 11, GMP/GAMP, and GLP
  • Providing a common, architecturally stable and extendable platform to power new product development


The challenges of this centred largely on ensuring the longer-term goals of the project were met (data homogeneity and integrity, a stable platform, etc.) while also catering to the rapid product development of the new product line.

As such, I had to pivot the team into a much more structured development workflow while simultaneously linking that workflow to sound requirement tracking for use by Product Owners/Managers.


DevOps & Requirements

A new DevOps area (with an accompanying Atlassian Confluence area) was dedicated to traceability of the new product requirements, split between Epics, Features, and User Stories. Confluence was used for heavy documentation, including meeting notes from in-depth discussions on features and products, which can be referenced by DevOps items. This allows the full richness of documentation without heavy amounts of information clogging up high-level work items.


Product teams can track feature development at the Epic/Feature level, while specialist development teams use supporting live documentation from Confluence to decompose high-level items into constituent User Stories, which is where team discussion on implementation belongs.


Workflow

To better structure workflow in the team, the following developmental practices were implemented:

  • Organisation of work delivery into 2-week sprints
  • Features estimated by the senior development team for very loose timeline scoping
  • Senior development team decompose Features into Stories
  • Full team estimate Stories to achieve more accurate shorter-term timeline
  • Backlog refinement with Product Owner to ensure priority is maintained with the business
  • Top stories from the backlog are dragged into the next sprint with the team and broken down into Tasks/Bugs, estimated in hours
  • Stories will have Test Cases written against them by test resource
  • Tasks/Bugs are all developed against discrete branches, with the master protected via Pull Requests
  • Pull requests require approval from a member of both the senior and regular teams before entering master
  • Tasks must have video proof captured by the developer and attached to the parent User Story
  • When all tasks/bugs on a Story are closed, Test resource assigns Test Cases to be executed by a team member (not anyone who developed the Story)
  • Once all Test Cases pass, the Story closes. When all Stories on a sprint are closed, an internal release is made for QA approval


This workflow structure allowed us to maximize efficiency within the team while providing key touch points for other departments to steer priority and approval without interfering with the development cycle.

Solentim Ltd (Oct 2016 - Aug 2021)

Software Engineer

I joined the software team at Solentim directly after finishing university, brought in at a time when they were expanding the team to support the research and development of a new product for market: a single-cell seeding device called the VIPS.

I began by modernizing the internal instrument production toolset, allowing for the low-level control, tuning, and alignment of instrumentation parts. This was focused on replacing a lot of the manual/subjective alignment and checks with repeatable quantitative ones, and wrapping complicated processes in more accessible packages so that less specialist training was required from production/service personnel.

Next, after the research phase of Solentim's VIPS was complete, I was assigned to the team making this product stable and targeted for the market.

This body of work included:

  • Optimizing image processing algorithms and mechanical/electrical control systems for speed
  • Replacing the internal research-focused UI with a more streamlined, customer-centric workflow
  • Reworking the architecture to enhance the maintainability of the application

I was then split out into a solo research project for the addition of two new high-value assays on our existing Cell Metric imaging systems. This required working for the first time with laboratory and field applications/marketing teams to understand what these biological assays were meant to do and how they added value to the customer workflow. After identifying a number of areas where the current system would be deficient, I had to develop a number of new product features in order to advance the research project, including:

  • Implementation and proof of new, highly accurate, auto-focusing mechanism
  • Development of new imaging methodology, achieving reduced optical interference and enhancing fluorescent imaging capabilities

The team was then re-prioritized towards linking the new VIPS and older Cell Metric products together in order to better integrate them and facilitate sales as an integrated product line. I was assigned to the team as the domain specialist for our existing Cell Metric product, and we developed a system for sharing results between the two systems via WCF clients hosted in Windows Services installed on our instrument PCs.

Refocusing on the research projects, it was decided that a new product combining the imaging capabilities of the existing line with liquid-handling capability would better fit the value add for the customer. I was therefore shifted to the research of a new liquid-handling product, which included translating the new focusing and imaging capabilities I had developed into the architectural framework used for new product development, as well as integrating neural-network image-processing models for cell identification.

Copyright © 2025 George Bishop - All Rights Reserved.
