<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="chrome=1">
<title>Work Experience - Niranjan Sujay</title>
<link rel="stylesheet" href="stylesheets/styles.css">
<meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<body>
<div class="wrapper">
<!-- Horizontal Navigation Bar -->
<nav style="text-align: right; margin-top: 30px;">
<ul style="list-style-type: none; display: inline-block;">
<li style="display: inline-block; margin: 0 20px;"><a href="index.html">Home</a></li>
<li style="display: inline-block; margin: 0 20px;"><a href="about.html">About Me</a></li>
<li style="display: inline-block; margin: 0 20px;"><a href="projects.html">Projects</a></li>
<li style="display: inline-block; margin: 0 20px;"><a href="publications.html">Publications</a></li>
<li style="display: inline-block; margin: 0 20px;"><a href="skills.html">Skills</a></li>
<li style="display: inline-block; margin: 0 20px;"><a href="resume.html">Resume</a></li>
</ul>
</nav>
<section>
<h2>Robotics Research Intern</h2>
<p><strong>AI4CE Lab, NYU Tandon</strong> | <em>May 2024 - Present</em></p>
<ul>
<li><strong>UrbanNav Project:</strong> Deployed a sensor-equipped Unitree Go1 robot dog with Livox LiDAR, u-blox GPS, and an Insta360 camera, capturing over 120 hours of RGB, LiDAR, and GPS data. Trained agents on thousands of hours of in-the-wild city walking and driving videos sourced from the web. Improved real-world navigation accuracy by 77.3%, a 20% increase over previous methods (ViNT, NoMaD), and achieved top metrics in right-turn and crossing scenarios (Arrival 87.8%, AOE 4.63°).</li>
<li><strong>MappingNYC Project:</strong> Designed a custom hardware mount with dual LiDARs, dual 360° cameras, and GPS, collecting 160+ hours of data for high-precision NYC mapping. Processed 2–3 TB of data per collection round, overcoming Fast-LIO limitations by segmenting rosbags into manageable chunks for waypoint processing. Employed interactive SLAM, enhancing loop closure accuracy and environmental fidelity.</li>
<li><strong>Curb2Door Project:</strong> Created a handheld sensor mount for 3D point cloud and image data collection in smaller urban environments. Combined 3D Gaussian Splatting and R3LIVE to produce high-detail 3D models of uneven terrains, improving autonomous navigation accuracy on varied surfaces by up to 30%.</li>
</ul>
</section>
<section>
<h2>Robotics Research Intern</h2>
<p><strong>Green Quest Solutions Private Limited, Singapore</strong> | <em>Sep 2021 - Aug 2022</em></p>
<ul>
<li><strong>Custom YOLO Model for Waste Detection:</strong> Developed a YOLO-based model for precise waste categorization, integrating Intel RealSense for depth measurement. Achieved an 85-96% improvement in waste dimension detection and sorting accuracy by refining depth perception algorithms.</li>
<li><strong>Custom Circuit Design:</strong> Engineered custom circuits integrating multiple sensors and a power distribution board, optimizing power management. Reduced power discharge rates and recharge time by 15%, extending system runtime by approximately 50%.</li>
<li><strong>Flexible Robotic Arm for Tree Health Monitoring:</strong> Built a sensor-equipped robotic arm to measure tree health metrics across a million trees, reducing assessment time from two years to six months.</li>
</ul>
</section>
<section>
<h2>Robotics Intern</h2>
<p><strong>Flux Auto, Bengaluru, India</strong> | <em>Dec 2019 - Feb 2020</em></p>
<ul>
<li><strong>Real-Time Object Recognition and Sensor Integration:</strong> Designed and implemented real-time object recognition on an NVIDIA Jetson Nano developer board, achieving a 60% improvement in detection accuracy for autonomous vehicle applications.</li>
<li><strong>Motion Planning and Decision-Making for AVs:</strong> Contributed to the design of motion planning algorithms, focusing on predictable, safe, and smooth vehicle behavior in complex urban navigation.</li>
</ul>
</section>
<footer>
<p>This experience page is maintained by <a href="https://github.com/YourGitHubUsername">Niranjan Sujay</a></p>
<div class="socials">
<a href="https://www.linkedin.com/in/your-linkedin-profile" target="_blank">LinkedIn</a> |
<a href="https://twitter.com/your-twitter-profile" target="_blank">Twitter</a> |
<a href="https://github.com/your-github-profile" target="_blank">GitHub</a> |
<a href="https://www.researchgate.net/profile/your-researchgate-profile" target="_blank">ResearchGate</a>
</div>
</footer>
</div>
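<!-- Decorative circle that follows the cursor; positioned by the script below -->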
<div class="circle"></div>
<script>
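// Cursor-follower effect: the .circle element eases toward the mouse pointer on every animation frame.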
const circle = document.querySelector('.circle');
let targetX = 0;
let targetY = 0;
let currentX = 0;
let currentY = 0;
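// Easing factor per frame (between 0 and 1); higher values make the circle track the pointer more tightly.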
let speed = 0.1;
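// Record the latest pointer position as the target.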
document.addEventListener('mousemove', (e) => {
targetX = e.clientX;
targetY = e.clientY;
});
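// Each frame, move a fraction of the remaining distance toward the target (linear interpolation),
// then re-centre the circle on the interpolated point.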
function updateCirclePosition() {
currentX += (targetX - currentX) * speed;
currentY += (targetY - currentY) * speed;
circle.style.left = `${currentX - circle.offsetWidth / 2}px`;
circle.style.top = `${currentY - circle.offsetHeight / 2}px`;
requestAnimationFrame(updateCirclePosition);
}
updateCirclePosition();
</script>
</body>
</html>