Network Performance, Security and Pre-production Tester - 6-Month Internship
Lacework
Project 1: 6-month internship (office-based in our Sophia Antipolis office, Valbonne)
Role Overview:
-We conduct rigorous performance testing of our hardware and software using industry-leading traffic generators, including Keysight BreakingPoint, VIAVI Avalanche, and CyberFlood. We utilize a custom automation framework to execute these tests and publish results for our Systems Engineering (SE) community.
-Our standard methodology begins with a goal-seeking phase to identify the peak performance threshold (the "best value"). Once established, we validate this result by running additional tests at -10% and +10% load increments to ensure stability and accuracy.
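The goal-seeking and validation workflow above can be sketched roughly as follows. This is an illustrative outline only, not the actual automation framework; the function names, the binary-search strategy, and the pass/fail test interface are all assumptions.

```python
def goal_seek(run_test, low, high, tolerance=1.0):
    """Binary-search for the highest load at which run_test(load) still
    passes -- the 'best value'. run_test is a hypothetical callable that
    returns True if the device sustains the given load."""
    best = low
    while high - low > tolerance:
        mid = (low + high) / 2
        if run_test(mid):
            best = mid      # device handled this load; search higher
            low = mid
        else:
            high = mid      # device failed; search lower
    return best

def validate(run_test, best):
    """Re-run at -10% and +10% of the best value to check stability,
    returning the pass/fail outcome at each increment."""
    return {delta: run_test(best * (1 + delta)) for delta in (-0.10, +0.10)}
```

For example, against a device that sustains up to 800 Mbps, `goal_seek` converges to roughly 800, and `validate` then records the outcomes at 720 and 880.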
The Challenge:
-Upgrading testing tools often introduces regressions in our automation scripts or inconsistencies in performance metrics. To address this, we need to develop a pre-production environment on Virtual Machines (VMs). This environment will act as a regression suite to validate tool upgrades by comparing new results against baseline data at identical load levels.
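The baseline comparison described above could be sketched like this. This is a hedged illustration, not the real regression suite; the metric names, dictionary-based result format, and the 5% tolerance threshold are all assumptions.

```python
def check_regression(baseline, new_results, tolerance=0.05):
    """Compare post-upgrade results against baseline data at identical
    load levels. Flags any metric that is missing or deviates from its
    baseline by more than the (assumed) 5% tolerance."""
    regressions = {}
    for metric, base in baseline.items():
        new = new_results.get(metric)
        if new is None or abs(new - base) / base > tolerance:
            regressions[metric] = (base, new)
    return regressions
```

A run that returns an empty dict means the tool upgrade reproduced the baseline within tolerance; any flagged entries point to metrics needing investigation.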
Technical Requirements:
-Virtualization: Experience deploying and managing pre-production environments using KVM, ESXi, or Proxmox.
-Networking Expertise: Deep understanding of L2 switching, routing protocols, and L4-L7 protocols (IPv4/v6, TCP, UDP) to interpret test results accurately.
-System Monitoring: Mastery of Linux system diagnostic tools like mpstat, top, and htop to analyze hardware behavior under load.
-Development & Scripting: Strong scripting skills, particularly in Python, are required for environment deployment.
-Performance Standards: Experience with RFC 2544 or RFC 9411 (benchmarking methodology) using Spirent, Keysight, or VIAVI tools is a significant plus.
Education:
Current student in Computer Science or a related technical field.
Professional proficiency in English.
#LI-NC1