
Blog

Short essays on things that seem interesting

Can LLMs really fix new bugs?

This website is built using mkdocs-material, which I chose because it’s easy to maintain and comes with a built-in blog plugin. Overall, it had been a smooth experience until today.

I ran into a strange issue while switching from a Conda-based setup to a standalone Python installation. Everything seemed fine at first, but when I ran mkdocs serve, the hot-reloading feature suddenly stopped working.

Naturally, I turned to ChatGPT for help. We went down a deep debugging rabbit hole, checking various dependencies. ChatGPT suggested verifying the versions of mkdocs, mkdocs-material, and watchdog. Despite 30 minutes of back-and-forth and trying different combinations, I still couldn’t pinpoint the problem.

Frustrated, I fell back on the old, reliable approach: Google. A quick search for “mkdocs serve not watching files” led me to the answer almost immediately. The first result pointed to a bug already filed on the MkDocs GitHub repository (https://github.com/mkdocs/mkdocs/issues/4032). The root cause turned out to be the click dependency. Downgrading click fixed the issue right away.
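If you hit the same symptom, the workaround that follows from that issue is to pin click in your dependency file until a fixed release lands. The exact boundary version below is an assumption on my part; check the issue thread for the version that applies to your setup:

```
# requirements.txt
mkdocs
mkdocs-material
click<8.2  # hypothetical boundary; see mkdocs/mkdocs#4032 for the actual safe version
```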

Painfully, I learned a valuable lesson: while ChatGPT can be a powerful assistant, it’s not a silver bullet for debugging - especially when dealing with newly introduced or obscure bugs. Sometimes, conventional search and existing bug reports are still the fastest path to a solution.

Robots or Physical AI?

It's amazing how robotics has become a regular topic in mainstream media. It is no longer science fiction. Autonomous cars are roaming the streets of SF and nobody bats an eye. Humanoid robot startups are multiplying like flies.

I started my undergrad in 2008, when people were still referencing the 2004 DARPA Grand Challenge as the state of the art.

One trend I’ve noticed recently is the push toward a new term: physical AI. Robots no longer seem to be the buzzword that excites people—or attracts funding—for tech startups. The terminology has shifted, but I’m not convinced the underlying direction has improved.

This shift makes me question the kinds of startups and technologies we are building today. Increasingly, we are focused on removing humans from the loop rather than enhancing human capability. We are building humanoid systems designed to replace people, yet we invest far less effort into assistive technologies that could meaningfully improve human lives.

Why aren’t we prioritizing AI systems that help a disabled person better sense, plan, and act in the world? Why aren’t we building tools that augment everyday workers—powered arms, wheeled legs, or intelligent exoskeletons—that reduce both cognitive and physical strain? Even relatively simple forms of augmentation could dramatically improve quality of life and productivity.

So the real question is this: why are we so focused on building AI to replace humans, rather than AI designed to assist and empower them?

The Evolving Landscape of ROS Visualization

When I first started working with ROS back in 2018, I was just getting my feet wet in the world of mobile robotics, though ROS itself had already been around for a decade. At the time, RViz was the de facto 3D visualization tool, and rqt_plot was the go-to option for visualizing time-series data from ROS topics. These tools were essential, if somewhat limited.

Fast forward to 2025, and the ROS visualization ecosystem has come a long way. Over the years, we’ve seen several visualization tools enter the space—some enduring, some fading away. One notable effort was Webviz by Cruise Automation, an open-source web-based visualizer for ROS bag files. Although Webviz offered a modern interface and browser-based convenience, active development was discontinued after Cruise shifted focus around 2021–2022 (source). The GitHub repository remains available, but it is effectively unmaintained.

On the other hand, PlotJuggler has remained a reliable tool for introspecting ROS data. It supports dynamic message parsing, real-time plotting, and flexible drag-and-drop layouts for time series inspection. It's still under active development and often integrated into real-world robot debugging workflows.

Then came Foxglove Studio—a cross-platform, modern visualization tool inspired by Webviz. Released in 2021, Foxglove Studio supports live data streaming and ROS bag playback, and provides powerful extensibility via plugins and custom panels. It works with both ROS 1 and ROS 2, making it a popular choice in today’s hybrid robotic stacks (GitHub).

| Tool | Pros | Cons |
|------|------|------|
| RViz | Open source, ROS-native, 3D capable | Requires roscore; not ideal for offline bag inspection |
| Webviz | Browser-based, open-source, intuitive UI | No longer actively maintained by Cruise |
| PlotJuggler | Dynamic message parsing, flexible layouts | UI can be overwhelming for new users |
| Foxglove | Web-native, cross-platform, ROS 1 & 2 support | Some features gated behind cloud services or setup overhead |

Between 2018 and 2025, as someone working on real-world robotics systems, visualizing ROS bags at scale has remained a tough challenge. At the end of the day, do you have enough human hours to manually inspect 10, 100, or even 1,000 bag files?

Some may argue that automation is the answer: write code to flag issues programmatically. And while that’s partially true, the reality is that you often need to see the problem first before you can define it well enough to detect it automatically. This is where visualization remains indispensable.
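As a minimal sketch of what "flag issues programmatically" can look like, assuming a topic's message timestamps have already been extracted from a bag (the function name and threshold are mine, not from any ROS API):

```python
# Flag suspicious silences in a topic's message timestamps.
# `stamps` is assumed to be a sorted list of receive times in seconds,
# extracted from a bag file beforehand.
def find_time_gaps(stamps, max_gap=0.5):
    """Return (t_prev, t_next) pairs where the topic went quiet too long."""
    return [(a, b) for a, b in zip(stamps, stamps[1:]) if b - a > max_gap]

# A nominally 10 Hz topic that drops out for ~0.8 s:
print(find_time_gaps([0.0, 0.1, 0.2, 1.0, 1.1]))  # [(0.2, 1.0)]
```

A check like this can triage hundreds of bags automatically, but note that picking a sensible `max_gap` per topic is exactly the kind of judgment that usually comes from having first seen the problem in a visualizer.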

You don't want your most skilled developers spending time manually reviewing every bag file. Investigating visual anomalies—like odd navigation behavior or sensor glitches—is something the human eye excels at. Sure, AI can help here, but training those models still requires significant human input.

While large language models benefit from vast, general-purpose datasets, robotics data is much more domain-specific. It often requires specialists to label and annotate correctly, especially for nuanced behaviors that only an experienced engineer would recognize.

To support my personal learning—and to help tackle this problem at scale—I decided to build a custom, static visualization using Bokeh, a Python-based interactive plotting library. For ground mobile robots, I’ve found that 2D visualization is often sufficient for trained specialists to review and interpret the data effectively.

While Bokeh doesn’t yet offer strong support for 3D visualization, its 2D rendering capabilities are both elegant and performant. It outputs pure HTML and JavaScript, enabling animations to run smoothly in-browser without requiring any backend. Even better, these HTML files are fully self-contained and portable—they can be hosted on any static web server or opened locally with just a double-click.

To be continued ...

College is a Stock Option

Recently I came across Professor Andrew Lo's MIT OCW lectures on finance theory on YouTube. In the lecture on stock options, he used the analogy of thinking of higher education as an option, a call option to be exact. You pay tuition as the premium. At the end of the four years, the degree may earn you a job with a good starting salary, just as an option becomes valuable when the stock price rises past the strike price. The most you can lose is the tuition, i.e., the option premium.

This is a pretty compelling counter to the catch-all "college is a scam" trend floating around social media nowadays. College is an option, not a guarantee: you can lose your premium if you don't pick the right stock. You also need to put in extra effort during college, unlike with a financial option, which pays off on its own.
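The payoff structure behind the analogy is easy to sketch in code (the numbers below are illustrative, not from the lecture):

```python
# Profit of a long call at expiry: max(spot - strike, 0) minus the premium.
# In the analogy, tuition plays the role of the premium.
def call_profit(spot, strike, premium):
    return max(spot - strike, 0.0) - premium

# Degree pays off: the upside is open-ended.
print(call_profit(150.0, 100.0, 10.0))  # 40.0
# Degree never "exercised": the loss is capped at the premium (tuition).
print(call_profit(50.0, 100.0, 10.0))   # -10.0
```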

| Aspect | Higher Education | Stock Options |
|--------|------------------|---------------|
| Investment | Tuition, time, and effort to gain knowledge, skills, and credentials. | The premium paid to acquire the option. |
| Potential Benefit | Higher lifetime earnings, career mobility, and personal growth. | Unlimited profit if the stock price moves favorably. |
| Risk | Losing tuition and time if the education is not leveraged effectively. | Loss of the premium if the stock price doesn’t move favorably. |
| Effort Required | Active participation (studying, internships, networking) to realize the benefits. | None after purchase; value depends solely on market movements. |

ML on the weather news channel

I came upon a news video about weather forecasting and the effect of machine learning on the speed and accuracy of forecasts compared to traditional weather models (numerical physics models, I'm guessing). I have to say I'm quite impressed by how popular ML is getting, even earning a segment on the news.


The ML model was also found to outperform the conventional models, although it does require more training data.


Source: How AI is changing the game for weather forecasting

GitHub: GraphCast

Blogging start

I want to start logging what I've learned and observed over time. It may be interesting for my future self to look back on. =)