A serverless GitHub manager for finding non-mutuals. Uses a client-heavy architecture with GitHub Gists as a database and adaptive caching to respect API limits.



After a period of intense work through 2024, I took a necessary step back from development in early 2025. When I was ready to dive back in a few months later, I found my GitHub account buzzing with activity. My network had grown significantly, and I had no easy way to manage it or understand my connections.
My search for a solution led me to some existing tools, like Hesbon-Osoro's follow-for-follow-back project. It was a great starting point, but I quickly identified its limitations. To unfollow someone, I was redirected to their GitHub page to click the button manually, and there was no way to perform bulk actions. This friction sparked an idea. I would build my own tool, not just to solve my own problem, but also as the perfect challenge to reignite my passion for coding.
I had two core principles for this project: the architecture had to be simple, and it had to be free to run. I also knew that for a tool like this to be truly useful, it couldn't rely on a shared, easily exhausted API rate limit. The first major decision was to use a GitHub OAuth App, giving every user their own 5,000 requests/hour quota to work with.
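The post doesn't include the auth setup, so treat the following as a sketch of the idea rather than the project's actual code: with NextAuth and the GitHub provider, each user authorizes the OAuth App with the scopes the app needs, and the resulting token is exposed on the session so every API call is billed against that user's own quota. The scope list and callback shape here are my assumptions.

import NextAuth, { NextAuthOptions } from 'next-auth';
import GithubProvider from 'next-auth/providers/github';

// Hypothetical sketch: each user signs in through the OAuth App, so every
// request they trigger counts against their own 5,000 requests/hour quota.
export const authOptions: NextAuthOptions = {
  providers: [
    GithubProvider({
      clientId: process.env.GITHUB_ID!,
      clientSecret: process.env.GITHUB_SECRET!,
      // Assumed scopes: profile, follow/unfollow, and Gist read/write.
      authorization: { params: { scope: 'read:user user:follow gist' } },
    }),
  ],
  callbacks: {
    // Keep the GitHub token in the JWT and surface it on the session so the
    // client-side GraphQL client and the Gist writes can use it.
    async jwt({ token, account }) {
      if (account) token.accessToken = account.access_token;
      return token;
    },
    async session({ session, token }) {
      return Object.assign(session, { accessToken: token.accessToken });
    },
  },
};

export default NextAuth(authOptions);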
My initial impulse was to use Next.js's server capabilities to fetch a user's entire network. However, I quickly realized this would be a dead end for power users. A full network sync for an account with tens of thousands of connections could easily exceed the execution limits on Vercel's free tier. Since the app's primary objective was completeness (you need the full picture to accurately calculate non-mutuals), a partial fetch wasn't an option. This forced a crucial pivot: move all fetching to the client, where the user's browser talks to GitHub directly with their own token and isn't bound by serverless execution limits.
With the fetching strategy decided, I needed a place to cache the processed data. As I explored options, the solution hit me: why not use GitHub itself? I could leverage GitHub Gists as a user-owned database. For state management, I chose a combination of React Query for server state and Zustand for client state.
Using Gists as a database is the cornerstone of the project's "GitHub-as-Infrastructure" approach. The benefits are significant: there is no separate backend or database to pay for or maintain, the cached data lives in the user's own account rather than on servers I control, and it persists across sessions and devices.
However, this approach requires accepting certain trade-offs, which can be viewed through the lens of the CAP theorem's principles: Consistency, Availability, and Partition Tolerance. For a tool like FollowSync, you can only optimize for two of these three. The app prioritizes Availability (there is always a cached snapshot ready to serve) and Partition Tolerance (it keeps working from that snapshot even when the GitHub API is slow, rate-limited, or unreachable).
It does this by sacrificing strict Consistency (or freshness). The data in the cache is a snapshot, not a real-time reflection of a user's GitHub network. This trade-off is the key to providing a fast, resilient, and free service.
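The Gist plumbing lives in lib/gist, which isn't reproduced in this post. As a rough sketch of the idea (the filename, description, and error handling are my own placeholders, not the project's), writing the cache is just a POST or PATCH against the Gist REST API with the serialized snapshot stored as a single private file:

import { CachedData } from '@/lib/types';

const GIST_API = 'https://api.github.com/gists';
const CACHE_FILENAME = 'followsync-cache.json'; // placeholder filename

// Create the cache Gist on first run, otherwise update the existing one.
export const writeCache = async (
  accessToken: string,
  data: CachedData,
  gistId: string | null
): Promise<{ id: string }> => {
  const response = await fetch(gistId ? `${GIST_API}/${gistId}` : GIST_API, {
    method: gistId ? 'PATCH' : 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: 'application/vnd.github+json',
    },
    body: JSON.stringify({
      description: 'FollowSync cache',
      public: false,
      files: { [CACHE_FILENAME]: { content: JSON.stringify(data) } },
    }),
  });
  if (!response.ok) {
    throw new Error(`Failed to write cache Gist: ${response.status}`);
  }
  return response.json();
};

findCacheGist and parseCache do the inverse: locate the Gist that contains the cache file and deserialize it back into the stores.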

View the full mermaid diagram here.
With the foundation laid, I ran into three interesting technical challenges that defined the project.
A "one-size-fits-all" caching strategy was never going to work. The solution was an adaptive, stale-while-revalidate caching system. The application first loads instantly from the Gist cache, then checks the cache's timestamp against a stale time that varies based on the user's network size.
To understand why this was critical, let's look at the numbers. GitHub's GraphQL API calculates rate limit costs based on the number of nodes requested, roughly 1 point per 100 nodes in the larger of the follower/following lists.
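As a rough, illustrative example (not a measured figure from the app): a power user with 30,000 followers and 25,000 following pays on the order of 300 points for a full sync under that formula. A single personal quota of 5,000 points/hour absorbs that easily once or twice, but naively re-fetching on every page load, or funneling every user through one shared token, would exhaust it almost immediately.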
This calculation makes it clear that a dynamic approach based on network size isn't just a nice-to-have; it's essential for the tool to function for its target power users.
I developed a tiered caching strategy based on the total number of connections (followers + following). This ensures the cache remains fresh for smaller, more dynamic networks while preventing unnecessary API calls for larger, more stable ones.
Here's how the tiers break down: networks of up to 2,000 connections get the shortest stale time, up to 10,000 a medium one, and up to 50,000 a long one; anything larger is never auto-refreshed at all, and the cache only updates when the user triggers a manual sync.
And for the ultimate power user, the new settings modal provides a custom staleTime option. This allows any user to override the adaptive tiers and set their own cache duration in minutes, offering the highest level of control over the application's behavior.
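Condensed from the useCacheManager hook included at the end of this post (the helper name here is mine; the real code inlines the cascade), the selection logic is just a series of thresholds, with the custom override taking priority:

import {
  STALE_TIME_LARGE,
  STALE_TIME_MANUAL_ONLY,
  STALE_TIME_MEDIUM,
  STALE_TIME_SMALL,
} from '@/lib/constants';

// Decide how long the cached snapshot counts as fresh for this network size.
const resolveStaleTime = (
  totalConnections: number,
  customStaleTime: number | null
): number => {
  if (customStaleTime) return customStaleTime * 60 * 1000; // minutes -> ms
  if (totalConnections <= 2000) return STALE_TIME_SMALL;
  if (totalConnections <= 10000) return STALE_TIME_MEDIUM;
  if (totalConnections <= 50000) return STALE_TIME_LARGE;
  return STALE_TIME_MANUAL_ONLY; // huge networks only refresh on manual sync
};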
While testing, I stumbled upon "ghost" accounts: deleted users who still appear in follower lists but lead to a 404 page when you visit their profile. The API doesn't flag them in the data it returns, so there's no direct way to filter them out. The key insight was two-fold: ghosts primarily matter when they are non-mutuals, and they almost always have 0 followers and 0 following.
The final solution is a multi-step process: first, narrow the non-mutual lists down to candidates with 0 followers and 0 following; next, skip anyone already confirmed as a ghost in a previous run; then verify the remaining candidates in small batches against a server-side endpoint that checks whether each profile still exists; and finally, persist the confirmed ghosts to the Gist cache so they never need to be re-checked.
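The server-side endpoint itself isn't shown in this post. A minimal sketch of the verification step, assuming a Next.js App Router route handler and relying on the REST users endpoint returning 404 for deleted accounts, might look like this (a real implementation would authenticate these requests rather than hit the low unauthenticated limit):

// app/api/verify-ghosts/route.ts — hypothetical sketch, not the project's code.
export async function POST(request: Request) {
  const { usernames } = (await request.json()) as { usernames: string[] };
  const ghosts: string[] = [];

  // A deleted account's profile no longer exists, so the REST lookup 404s.
  for (const username of usernames) {
    const res = await fetch(`https://api.github.com/users/${username}`, {
      headers: { Accept: 'application/vnd.github+json' },
    });
    if (res.status === 404) ghosts.push(username);
  }

  return Response.json({ ghosts });
}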
A core goal was to eliminate friction by making actions instant and in-app. When a user follows or unfollows someone, I use an optimistic update. The UI reflects the change immediately, while the API call happens in the background. If it fails, React Query seamlessly reverts the UI and displays an error. I extended this pattern to bulk operations with a custom useBulkOperation hook, which iterates through a list of users, reports progress in real-time, and adds a small delay between requests to respect API limits.
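The useBulkOperation hook isn't reproduced here either, but its shape follows directly from that description. A simplified sketch (the state shape and delay value are illustrative):

import { useCallback, useState } from 'react';

const DELAY_BETWEEN_REQUESTS = 500; // illustrative; tune to stay polite to the API

type BulkProgress = { done: number; total: number; failed: string[] };

// Run one async operation per user, sequentially, reporting progress as it goes.
export const useBulkOperation = <T extends { login: string }>(
  operation: (user: T) => Promise<unknown>
) => {
  const [progress, setProgress] = useState<BulkProgress>({
    done: 0,
    total: 0,
    failed: [],
  });

  const run = useCallback(
    async (users: T[]) => {
      setProgress({ done: 0, total: users.length, failed: [] });
      for (const user of users) {
        try {
          await operation(user);
        } catch {
          setProgress((p) => ({ ...p, failed: [...p.failed, user.login] }));
        }
        setProgress((p) => ({ ...p, done: p.done + 1 }));
        await new Promise((resolve) =>
          setTimeout(resolve, DELAY_BETWEEN_REQUESTS)
        );
      }
    },
    [operation]
  );

  return { run, progress };
};

In the app, this would be handed the mutateAsync of the follow or unfollow mutation from useFollowManager.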
From the outset, a core design principle for FollowSync was to empower "power users." While the adaptive caching system worked well for most, I wanted to provide more granular control, fully realizing the initial vision of a highly customizable tool. This led to the implementation of a user settings feature.
The primary goal was to allow users to override the default application behavior to better suit their specific needs. This included customizing the cache staleTime, adjusting the batch size for ghost detection, and tweaking UI elements like avatar visibility and pagination size.
The solution was a settings modal, managed by its own dedicated Zustand store, useSettingsStore. This approach kept the settings logic cleanly separated from other application state. The settings themselves are saved directly into the user's Gist cache as a distinct settings property, ensuring they persist across sessions. This implementation not only added the desired customizability but also reinforced the "GitHub-as-Infrastructure" philosophy by storing user preferences alongside their network data.
As FollowSync grew, the initial state management solution, a single useCacheStore, began to show signs of strain. It was becoming a classic "god store": a monolithic entity handling network data, ghost detection state, Gist interactions, and caching logic. This tight coupling made the store difficult to maintain, test, and extend. It was functional, but it didn't meet my standards for a clean, scalable architecture.
The solution was a significant refactor guided by the principle of separation of concerns. I broke down the monolithic useCacheStore into several smaller, more focused stores: useNetworkStore for the follower/following data and non-mutual calculations, useGhostStore for ghost detection state, useGistStore for Gist identifiers and cache metadata, and useSettingsStore for user preferences.
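Each of these stores ends up being small and single-purpose. As an illustration, a useGhostStore matching the way the hooks shown later use it needs little more than this (the exact slice in the repo may differ):

import { create } from 'zustand';
import { UserInfoFragment } from '@/lib/gql/types';

type GhostStore = {
  ghosts: UserInfoFragment[];
  isCheckingGhosts: boolean;
  setGhosts: (ghosts: UserInfoFragment[]) => void;
  addGhosts: (newGhosts: UserInfoFragment[]) => void;
  setIsCheckingGhosts: (checking: boolean) => void;
};

export const useGhostStore = create<GhostStore>((set) => ({
  ghosts: [],
  isCheckingGhosts: false,
  setGhosts: (ghosts) => set({ ghosts }),
  // Append newly confirmed ghosts without clobbering earlier results.
  addGhosts: (newGhosts) =>
    set((state) => ({ ghosts: [...state.ghosts, ...newGhosts] })),
  setIsCheckingGhosts: (checking) => set({ isCheckingGhosts: checking }),
}));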
To orchestrate these now-independent stores, I introduced a centralized custom hook, useCacheManager. This orchestrator is responsible for the application's core logic: initializing data, fetching the network, handling the cache, and persisting changes. It acts as the "brain," calling on the specialized stores to manage their respective slices of state.
This refactor did introduce new challenges, particularly a race condition where ghost verification could trigger before the full network data was fetched. I resolved this by structuring the orchestrator's functions as callbacks, ensuring a sequential and predictable flow of operations. The result is a far more robust, modular, and maintainable state management system that is well-positioned for future development.
Building FollowSync was an exercise in pragmatic engineering and a rewarding way to jump back into development.
With the core features and user settings now complete, the project has reached a new level of maturity. The road ahead is focused on continued maintenance, monitoring for any breaking changes in the GitHub API, and listening to user feedback for any future enhancements.
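For reference, the key hooks and stores discussed above follow below, starting with an excerpt of the useCacheManager orchestrator (cache loading and the initial network sync):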
import { GraphQLClient } from 'graphql-request';
import { toast } from 'sonner';
import { useCallback } from 'react';
import { useNetworkStore } from '@/lib/store/network';
import { useGistStore } from '@/lib/store/gist';
import { useGhostStore } from '@/lib/store/ghost';
import { useSettingsStore } from '@/lib/store/settings';
import { findCacheGist, parseCache, writeCache } from '@/lib/gist';
import { fetchAllUserFollowersAndFollowing } from '@/lib/gql/fetchers';
import {
  GIST_ID_STORAGE_KEY,
  STALE_TIME_LARGE,
  STALE_TIME_MANUAL_ONLY,
  STALE_TIME_MEDIUM,
  STALE_TIME_SMALL,
} from '@/lib/constants';
import { CachedData, ProgressCallbacks } from '@/lib/types';
import { UserInfoFragment } from '@/lib/gql/types';

export const useCacheManager = () => {
  const setNetwork = useNetworkStore((state) => state.setNetwork);
  const { setGhosts } = useGhostStore();
  const { setGistName, setGistData } = useGistStore();
  const settings = useSettingsStore();

  // Hydrate the client-side stores from a parsed Gist cache.
  const loadFromCache = useCallback(
    (cachedData: CachedData) => {
      setNetwork(cachedData.network);
      setGhosts(cachedData.ghosts);
      setGistData({
        timestamp: cachedData.timestamp,
        metadata: cachedData.metadata,
      });
      if (cachedData.settings) {
        settings.setShowAvatars(cachedData.settings.showAvatars);
        settings.setGhostDetectionBatchSize(
          cachedData.settings.ghostDetectionBatchSize
        );
        settings.setPaginationPageSize(cachedData.settings.paginationPageSize);
        settings.setCustomStaleTime(cachedData.settings.customStaleTime);
      }
    },
    [setNetwork, setGhosts, setGistData, settings]
  );

  const initializeAndFetchNetwork = useCallback(
    async (
      client: GraphQLClient,
      username: string,
      accessToken: string,
      progress: ProgressCallbacks
    ) => {
      const { show, update, complete, fail } = progress;
      const localGistName = window.localStorage.getItem(GIST_ID_STORAGE_KEY);
      setGistName(localGistName);
      const isForced = useGistStore.getState().forceNextRefresh;
      const currentGistName = useGistStore.getState().gistName;
      if (isForced) {
        useGistStore.getState().setForceNextRefresh(false);
      }
      if (!isForced) {
        const foundGist = await findCacheGist(client, currentGistName);
        if (foundGist) {
          const cachedData = parseCache(foundGist);
          if (cachedData) {
            setGistName(foundGist.name);
            // Pick a stale time: the user's custom override wins, otherwise
            // fall back to the adaptive tiers based on network size.
            const totalConnections = cachedData.metadata.totalConnections;
            const { customStaleTime } = settings;
            let staleTime = 0;
            if (customStaleTime) {
              staleTime = customStaleTime * 60 * 1000;
            } else if (totalConnections <= 2000) {
              staleTime = STALE_TIME_SMALL;
            } else if (totalConnections <= 10000) {
              staleTime = STALE_TIME_MEDIUM;
            } else if (totalConnections <= 50000) {
              staleTime = STALE_TIME_LARGE;
            } else {
              staleTime = STALE_TIME_MANUAL_ONLY;
            }
            const isStale = Date.now() - cachedData.timestamp > staleTime;
            loadFromCache(cachedData);
            if (!isStale) {
              toast.info('Loaded fresh data from cache.');
              return cachedData.network;
            }
            if (staleTime === STALE_TIME_MANUAL_ONLY) {
              toast.info(
                'Data loaded from cache. Refresh manually for the latest update.'
              );
              return cachedData.network;
            }
          }
        }
      }
      // Forced refresh, cache miss, or stale cache: fetch the full network.
      const fetchStart = performance.now();
      show({
        title: 'Syncing Your Network',
        message: 'Fetching connections from GitHub...',
        items: [
          { label: 'Followers', current: 0, total: 0 },
          { label: 'Following', current: 0, total: 0 },
        ],
      });
      try {
        const networkData = await fetchAllUserFollowersAndFollowing({
          client,
          username,
          onProgress: (p) => {
            update([
              {
                label: 'Followers',
                current: p.fetchedFollowers,
                total: p.totalFollowers,
              },
              {
                label: 'Following',
                current: p.fetchedFollowing,
                total: p.totalFollowing,
              },
            ]);
          },
        });
        const fetchEnd = performance.now();
        const fetchDuration = Math.round((fetchEnd - fetchStart) / 1000);
        const followers = networkData.followers.nodes as UserInfoFragment[];
        const following = networkData.following.nodes as UserInfoFragment[];
        const network = { followers, following };
        const timestamp = Date.now();
        const dataToCache: CachedData = {
          network,
          ghosts: [],
          settings,
          timestamp,
          metadata: {
            totalConnections: followers.length + following.length,
            fetchDuration,
            cacheVersion: '1.0',
          },
        };
        // Write the fresh snapshot back to the user's Gist.
        const newGist = await writeCache(
          accessToken,
          dataToCache,
          currentGistName
        );
        setNetwork(network);
        setGistData({ timestamp, metadata: dataToCache.metadata });
        setGistName(newGist.id);
        complete();
        return network;
        // eslint-disable-next-line @typescript-eslint/no-explicit-any
      } catch (err: any) {
        fail({ message: err.message || 'Failed to sync network.' });
        throw err;
      }
    },
    [settings, loadFromCache, setGistName, setNetwork, setGistData]
  );

  // other functions

  return {
    initializeAndFetchNetwork,
    loadFromCache,
    // other functions
  };
};
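Next is useFollowManager, which implements the optimistic follow/unfollow mutations described earlier: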
import { useSession } from 'next-auth/react';
import { useMutation } from '@tanstack/react-query';
import { toast } from 'sonner';
import { useNetworkStore } from '@/lib/store/network';
import { followUser, unfollowUser } from '@/lib/gql/fetchers';
import { useClientAuthenticatedGraphQLClient } from '@/lib/gql/client';
import { UserInfoFragment } from '@/lib/gql/types';
import { useModalsStore } from '@/lib/store/modals';
import { useCacheManager } from './useCacheManager';

export const useFollowManager = () => {
  const { client } = useClientAuthenticatedGraphQLClient();
  const { data: session } = useSession();
  const { network, setNetwork } = useNetworkStore();
  const { persistChanges } = useCacheManager();
  const { incrementActionCount } = useModalsStore();

  const followMutation = useMutation({
    mutationFn: (userToFollow: UserInfoFragment) => {
      if (!client) throw new Error('GraphQL client not available');
      return followUser({ client, userId: userToFollow.id });
    },
    // Optimistic update: add the user to "following" immediately and keep a
    // snapshot of the previous network so the UI can roll back on error.
    onMutate: async (userToFollow: UserInfoFragment) => {
      const previousNetwork = network;
      const newFollowing = [...network.following, userToFollow];
      const newNetwork = { ...network, following: newFollowing };
      setNetwork(newNetwork);
      return { previousNetwork };
    },
    onError: (err, userToFollow, context) => {
      if (context?.previousNetwork) {
        setNetwork(context.previousNetwork);
      }
      toast.error(`Failed to follow @${userToFollow.login}: ${err.message}`);
    },
    onSettled: () => {
      if (session?.accessToken) {
        persistChanges();
      }
    },
  });

  const unfollowMutation = useMutation({
    mutationFn: (userToUnfollow: UserInfoFragment) => {
      if (!client) throw new Error('GraphQL client not available');
      return unfollowUser({ client, userId: userToUnfollow.id });
    },
    // Same pattern in reverse: remove the user from "following" right away.
    onMutate: async (userToUnfollow: UserInfoFragment) => {
      const previousNetwork = network;
      const newFollowing = network.following.filter(
        (u) => u.id !== userToUnfollow.id
      );
      const newNetwork = { ...network, following: newFollowing };
      setNetwork(newNetwork);
      return { previousNetwork };
    },
    onError: (err, userToUnfollow, context) => {
      if (context?.previousNetwork) {
        setNetwork(context.previousNetwork);
      }
      toast.error(
        `Failed to unfollow @${userToUnfollow.login}: ${err.message}`
      );
    },
    // Success or failure, persist the latest state back to the Gist cache.
    onSettled: () => {
      if (session?.accessToken) {
        persistChanges();
      }
    },
  });

  return {
    followMutation,
    unfollowMutation,
    incrementActionCount,
  };
};
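The useGhostDetector hook runs the batched ghost verification against the server-side endpoint: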
import { useEffect } from 'react';
import { useSession } from 'next-auth/react';
import { useGhostStore } from '@/lib/store/ghost';
import { useNetworkStore } from '@/lib/store/network';
import { useSettingsStore } from '@/lib/store/settings';
import { useCacheManager } from './useCacheManager';

const DELAY_BETWEEN_BATCHES = 1000; // 1 second

interface GhostDetectorProps {
  isNetworkReady: boolean;
}

export const useGhostDetector = ({ isNetworkReady }: GhostDetectorProps) => {
  const { data: session } = useSession();
  const accessToken = session?.accessToken;
  const { nonMutuals } = useNetworkStore();
  const { nonMutualsFollowingYou, nonMutualsYouFollow } = nonMutuals;
  const { ghosts, setIsCheckingGhosts } = useGhostStore();
  const { ghostDetectionBatchSize } = useSettingsStore();
  const { updateGhosts } = useCacheManager();

  useEffect(() => {
    const detectGhosts = async () => {
      if (!isNetworkReady || !accessToken) {
        setIsCheckingGhosts(false);
        return;
      }
      setIsCheckingGhosts(true);
      // Candidates: non-mutuals reporting 0 followers and 0 following.
      const potentialGhosts = [
        ...nonMutualsYouFollow,
        ...nonMutualsFollowingYou,
      ].filter(
        (user) =>
          user?.followers.totalCount === 0 && user?.following.totalCount === 0
      );
      // Skip anyone already confirmed as a ghost in a previous run.
      const newPotentialGhosts = potentialGhosts.filter(
        (potentialGhost) =>
          !ghosts.some(
            (existingGhost) => existingGhost.login === potentialGhost.login
          )
      );
      if (newPotentialGhosts.length === 0) {
        setIsCheckingGhosts(false);
        return;
      }
      // Verify candidates in batches, pausing between batches to stay gentle
      // on the API.
      const confirmedGhosts = [];
      for (
        let i = 0;
        i < newPotentialGhosts.length;
        i += ghostDetectionBatchSize
      ) {
        const batch = newPotentialGhosts.slice(i, i + ghostDetectionBatchSize);
        const usernames = batch.map((user) => user?.login);
        try {
          const response = await fetch('/api/verify-ghosts', {
            method: 'POST',
            headers: {
              'Content-Type': 'application/json',
            },
            body: JSON.stringify({ usernames }),
          });
          if (response.ok) {
            const { ghosts: ghostUsernames } = await response.json();
            const batchGhosts = batch.filter((user) =>
              ghostUsernames.includes(user?.login)
            );
            confirmedGhosts.push(...batchGhosts);
          }
        } catch (error) {
          console.error('Error verifying ghost batch:', error);
        }
        if (i + ghostDetectionBatchSize < newPotentialGhosts.length) {
          await new Promise((resolve) =>
            setTimeout(resolve, DELAY_BETWEEN_BATCHES)
          );
        }
      }
      // Persist confirmed ghosts so they are never re-checked.
      if (confirmedGhosts.length > 0) {
        await updateGhosts(confirmedGhosts);
      }
      setIsCheckingGhosts(false);
    };
    detectGhosts();
  }, [
    isNetworkReady,
    nonMutualsFollowingYou,
    nonMutualsYouFollow,
    accessToken,
    ghosts,
    setIsCheckingGhosts,
    ghostDetectionBatchSize,
    updateGhosts,
  ]);
};
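The useSettingsStore slice holds the preferences that get persisted to the Gist alongside the network data: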
import { create } from 'zustand';
import { PAGE_SIZE_LIST } from '../constants';

export type SettingsState = {
  isSettingsModalOpen: boolean;
  showAvatars: boolean;
  ghostDetectionBatchSize: number;
  paginationPageSize: number;
  customStaleTime: number | null;
};

export type SettingsActions = {
  toggleSettingsModal: () => void;
  setShowAvatars: (show: boolean) => void;
  setGhostDetectionBatchSize: (size: number) => void;
  setPaginationPageSize: (size: number) => void;
  setCustomStaleTime: (time: number | null) => void;
  saveSettings: (
    accessToken: string,
    persistChanges: (accessToken: string) => Promise<void>
  ) => Promise<void>;
};

export type SettingsStore = SettingsState & SettingsActions;

export const useSettingsStore = create<SettingsStore>((set) => ({
  isSettingsModalOpen: false,
  showAvatars: true,
  ghostDetectionBatchSize: 10,
  paginationPageSize: PAGE_SIZE_LIST[0],
  customStaleTime: null,
  toggleSettingsModal: () =>
    set((state) => ({ isSettingsModalOpen: !state.isSettingsModalOpen })),
  setShowAvatars: (show) => set({ showAvatars: show }),
  setGhostDetectionBatchSize: (size) => set({ ghostDetectionBatchSize: size }),
  setPaginationPageSize: (size) => set({ paginationPageSize: size }),
  setCustomStaleTime: (time) => set({ customStaleTime: time }),
  saveSettings: async (accessToken, persistChanges) => {
    await persistChanges(accessToken);
  },
}));
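And finally, the useCacheManager orchestrator in full after the refactor, including the persistChanges and updateGhosts functions used by the other hooks: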
import { GraphQLClient } from 'graphql-request';
import { toast } from 'sonner';
import { useCallback } from 'react';
import { useNetworkStore } from '@/lib/store/network';
import { useGistStore } from '@/lib/store/gist';
import { useGhostStore } from '@/lib/store/ghost';
import { useSettingsStore } from '@/lib/store/settings';
import { findCacheGist, parseCache, writeCache } from '@/lib/gist';
import { fetchAllUserFollowersAndFollowing } from '@/lib/gql/fetchers';
import {
  GIST_ID_STORAGE_KEY,
  STALE_TIME_LARGE,
  STALE_TIME_MANUAL_ONLY,
  STALE_TIME_MEDIUM,
  STALE_TIME_SMALL,
} from '@/lib/constants';
import { CachedData, ProgressCallbacks } from '@/lib/types';
import { UserInfoFragment } from '@/lib/gql/types';
import { useSession } from 'next-auth/react';

export const useCacheManager = () => {
  const setNetwork = useNetworkStore((state) => state.setNetwork);
  const { setGhosts, addGhosts } = useGhostStore();
  const { setGistName, setGistData, setTimestamp } = useGistStore();
  const settings = useSettingsStore();
  const { data } = useSession();
  const accessToken = data?.accessToken;

  // Hydrate the client-side stores from a parsed Gist cache.
  const loadFromCache = useCallback(
    (cachedData: CachedData) => {
      setNetwork(cachedData.network);
      setGhosts(cachedData.ghosts);
      setGistData({
        timestamp: cachedData.timestamp,
        metadata: cachedData.metadata,
      });
      if (cachedData.settings) {
        settings.setShowAvatars(cachedData.settings.showAvatars);
        settings.setGhostDetectionBatchSize(
          cachedData.settings.ghostDetectionBatchSize
        );
        settings.setPaginationPageSize(cachedData.settings.paginationPageSize);
        settings.setCustomStaleTime(cachedData.settings.customStaleTime);
      }
    },
    [setNetwork, setGhosts, setGistData, settings]
  );

  const initializeAndFetchNetwork = useCallback(
    async (
      client: GraphQLClient,
      username: string,
      accessToken: string,
      progress: ProgressCallbacks
    ) => {
      const { show, update, complete, fail } = progress;
      const localGistName = window.localStorage.getItem(GIST_ID_STORAGE_KEY);
      setGistName(localGistName);
      const isForced = useGistStore.getState().forceNextRefresh;
      const currentGistName = useGistStore.getState().gistName;
      if (isForced) {
        useGistStore.getState().setForceNextRefresh(false);
      }
      if (!isForced) {
        const foundGist = await findCacheGist(client, currentGistName);
        if (foundGist) {
          const cachedData = parseCache(foundGist);
          if (cachedData) {
            setGistName(foundGist.name);
            // Custom override first, then the adaptive tiers by network size.
            const totalConnections = cachedData.metadata.totalConnections;
            const { customStaleTime } = settings;
            let staleTime = 0;
            if (customStaleTime) {
              staleTime = customStaleTime * 60 * 1000;
            } else if (totalConnections <= 2000) {
              staleTime = STALE_TIME_SMALL;
            } else if (totalConnections <= 10000) {
              staleTime = STALE_TIME_MEDIUM;
            } else if (totalConnections <= 50000) {
              staleTime = STALE_TIME_LARGE;
            } else {
              staleTime = STALE_TIME_MANUAL_ONLY;
            }
            const isStale = Date.now() - cachedData.timestamp > staleTime;
            loadFromCache(cachedData);
            if (!isStale) {
              toast.info('Loaded fresh data from cache.');
              return cachedData.network;
            }
            if (staleTime === STALE_TIME_MANUAL_ONLY) {
              toast.info(
                'Data loaded from cache. Refresh manually for the latest update.'
              );
              return cachedData.network;
            }
          }
        }
      }
      // Forced refresh, cache miss, or stale cache: fetch the full network.
      const fetchStart = performance.now();
      show({
        title: 'Syncing Your Network',
        message: 'Fetching connections from GitHub...',
        items: [
          { label: 'Followers', current: 0, total: 0 },
          { label: 'Following', current: 0, total: 0 },
        ],
      });
      try {
        const networkData = await fetchAllUserFollowersAndFollowing({
          client,
          username,
          onProgress: (p) => {
            update([
              {
                label: 'Followers',
                current: p.fetchedFollowers,
                total: p.totalFollowers,
              },
              {
                label: 'Following',
                current: p.fetchedFollowing,
                total: p.totalFollowing,
              },
            ]);
          },
        });
        const fetchEnd = performance.now();
        const fetchDuration = Math.round((fetchEnd - fetchStart) / 1000);
        const followers = networkData.followers.nodes as UserInfoFragment[];
        const following = networkData.following.nodes as UserInfoFragment[];
        const network = { followers, following };
        const timestamp = Date.now();
        const dataToCache: CachedData = {
          network,
          ghosts: [],
          settings,
          timestamp,
          metadata: {
            totalConnections: followers.length + following.length,
            fetchDuration,
            cacheVersion: '1.0',
          },
        };
        const newGist = await writeCache(
          accessToken,
          dataToCache,
          currentGistName
        );
        setNetwork(network);
        setGistData({ timestamp, metadata: dataToCache.metadata });
        setGistName(newGist.id);
        complete();
        return network;
        // eslint-disable-next-line @typescript-eslint/no-explicit-any
      } catch (err: any) {
        fail({ message: err.message || 'Failed to sync network.' });
        throw err;
      }
    },
    [settings, loadFromCache, setGistName, setNetwork, setGistData]
  );

  // Serialize the current store state and write it back to the cache Gist.
  const persistChanges = useCallback(async () => {
    if (!accessToken) return;
    const { network } = useNetworkStore.getState();
    const { ghosts } = useGhostStore.getState();
    const { metadata, gistName } = useGistStore.getState();
    const currentSettings = useSettingsStore.getState();
    if (!network || !metadata) return;
    const newTimestamp = Date.now();
    setTimestamp(newTimestamp);
    const dataToCache: CachedData = {
      network,
      ghosts,
      settings: currentSettings,
      timestamp: newTimestamp,
      metadata,
    };
    await writeCache(accessToken, dataToCache, gistName);
  }, [accessToken, setTimestamp]);

  // Record newly confirmed ghosts and persist them immediately.
  const updateGhosts = useCallback(
    async (newGhosts: UserInfoFragment[]) => {
      addGhosts(newGhosts);
      await persistChanges();
    },
    [addGhosts, persistChanges]
  );

  return {
    initializeAndFetchNetwork,
    loadFromCache,
    persistChanges,
    updateGhosts,
  };
};