A serverless GitHub manager for finding non-mutuals. Uses a client-heavy architecture with GitHub Gists as a database and adaptive caching to respect API limits.
After a period of intense work through 2024, I took a necessary step back from development in early 2025. When I was ready to dive back in a few months later, I found my GitHub account buzzing with activity. My network had grown significantly, and I had no easy way to manage it or understand my connections.
My search for a solution led me to some existing tools, like Hesbon-Osoro's follow-for-follow-back project. It was a great starting point, but I quickly identified its limitations. To unfollow someone, I was redirected to their GitHub page to click the button manually, and there was no way to perform bulk actions. This friction sparked an idea. I would build my own tool, not just to solve my own problem, but also as the perfect challenge to reignite my passion for coding.
I had two core principles for this project: the architecture had to be simple, and it had to be free to run. I also knew that for a tool like this to be truly useful, it couldn't rely on a shared, easily exhausted API rate limit. The first major decision was to use a GitHub OAuth App, giving every user their own 5,000 requests/hour quota to work with.
My initial impulse was to use Next.js's server capabilities to fetch a user's entire network. However, I quickly realized this would be a dead end for power users. A full network sync for an account with tens of thousands of connections could easily exceed the execution limits on Vercel's free tier. Since the app's primary objective was completeness (you need the full picture to accurately calculate non-mutuals), a partial fetch wasn't an option. This forced a crucial pivot: all of the heavy fetching would move to the client, where each user's browser works against their own rate-limit quota.
With the fetching strategy decided, I needed a place to cache the processed data. As I explored options, the solution hit me: why not use GitHub itself? I could leverage GitHub Gists as a user-owned database. For state management, I chose a combination of React Query for server state and Zustand for client state.
Using Gists as a database is the cornerstone of the project's "GitHub-as-Infrastructure" approach. The benefits are significant: there is no database to host or pay for, the cached data lives in the user's own account, and it follows them across devices and sessions.
However, this approach requires accepting certain trade-offs, which can be viewed through the lens of the CAP theorem's principles: Consistency, Availability, and Partition Tolerance. For a tool like FollowSync, you can realistically optimize for only two of these three. The app prioritizes Availability and Partition Tolerance.
It does this by sacrificing strict Consistency (or freshness). The data in the cache is a snapshot, not a real-time reflection of a user's GitHub network. This trade-off is the key to providing a fast, resilient, and free service.
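As a concrete sketch of the Gist-as-database idea, here is roughly what persisting the cache looks like against GitHub's REST Gists endpoint. This is an illustration, not the project's actual `writeCache` implementation; `buildGistPayload` and `CACHE_FILENAME` are names invented for this example.

```typescript
// Illustrative sketch: persist a JSON cache blob as a secret Gist via the
// GitHub REST API. Names here are hypothetical, not from the real codebase.
const CACHE_FILENAME = 'followsync-cache.json';

export function buildGistPayload(cache: object) {
  return {
    description: 'FollowSync network cache',
    public: false, // secret gist: unlisted, readable only via its URL/id
    files: { [CACHE_FILENAME]: { content: JSON.stringify(cache) } },
  };
}

export async function writeCacheGist(
  accessToken: string,
  cache: object
): Promise<string> {
  const res = await fetch('https://api.github.com/gists', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: 'application/vnd.github+json',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(buildGistPayload(cache)),
  });
  if (!res.ok) throw new Error(`Gist write failed: ${res.status}`);
  const gist = await res.json();
  // The caller stores this id (e.g. in localStorage) to find the cache later.
  return gist.id;
}
```

Updating an existing cache is the same shape against `PATCH /gists/{gist_id}`, which is why a single stored gist id is enough state to make the whole scheme work.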
View the full Mermaid diagram here.
With the foundation laid, I ran into three interesting technical challenges that defined the project.
A "one-size-fits-all" caching strategy was never going to work. The solution was an adaptive, stale-while-revalidate caching system. The application first loads instantly from the Gist cache, then checks the cache's timestamp against a stale time that varies based on the user's network size.
To understand why this was critical, let's look at the numbers. GitHub's GraphQL API calculates rate limit costs based on the number of nodes requested, roughly 1 point per 100 nodes in the larger of the follower/following lists.
This calculation makes it clear that a dynamic approach based on network size isn't just a nice-to-have; it's essential for the tool to function for its target power users.
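The tiering that falls out of this math can be sketched as a pair of small helpers. The thresholds match the store code shown later; the concrete durations are illustrative stand-ins for the real constants in @/lib/constants.

```typescript
// Sketch of the adaptive stale-time logic. Thresholds mirror the store code;
// the duration values are illustrative, not the project's real constants.
const HOUR = 60 * 60 * 1000;
export const STALE_TIME_SMALL = 1 * HOUR;       // < 2,000 connections
export const STALE_TIME_MEDIUM = 6 * HOUR;      // < 10,000 connections
export const STALE_TIME_LARGE = 24 * HOUR;      // < 50,000 connections
export const STALE_TIME_MANUAL_ONLY = Infinity; // huge networks: manual refresh only

// Rough GraphQL rate-limit cost: ~1 point per 100 nodes in the larger list.
export function estimatedCost(largerListSize: number): number {
  return Math.ceil(largerListSize / 100);
}

export function staleTimeFor(totalConnections: number): number {
  if (totalConnections < 2_000) return STALE_TIME_SMALL;
  if (totalConnections < 10_000) return STALE_TIME_MEDIUM;
  if (totalConnections < 50_000) return STALE_TIME_LARGE;
  return STALE_TIME_MANUAL_ONLY;
}
```

With a 5,000 points/hour budget, a 25,000-follower sync costs roughly 250 points, so re-syncing a huge network on every page load would burn through the quota quickly; larger networks therefore get longer stale windows.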
While testing, I stumbled upon "ghost" accounts: deleted users who still appear in follower lists and lead to a 404 page. The GitHub API provides no way to detect them. The key insight was two-fold: ghosts primarily matter when they are non-mutuals, and they almost always have 0 followers and 0 following.
The final solution is a multi-step process: filter the non-mutual lists down to candidates with 0 followers and 0 following, verify those candidates in small batches against a server route, pause between batches to stay friendly to the API, and persist the confirmed ghosts to the cache.
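The two phases of the check can be sketched as follows: a cheap local filter over data already in the cache, followed by a network verification (a deleted account's profile returns 404 from the REST API). The function names and shapes here are illustrative; the real implementation lives in the useGhostDetector hook shown later.

```typescript
// Illustrative two-phase ghost check. Names and shapes are hypothetical.
type NetworkUser = {
  login: string;
  followers: { totalCount: number };
  following: { totalCount: number };
};

// Phase 1: cheap local filter. Ghosts almost always report 0/0 counts,
// so this shrinks the candidate pool without spending any API budget.
export function potentialGhosts(nonMutuals: NetworkUser[]): NetworkUser[] {
  return nonMutuals.filter(
    (u) => u.followers.totalCount === 0 && u.following.totalCount === 0
  );
}

// Phase 2: confirm against the REST API; deleted accounts 404 here.
export async function isGhost(login: string): Promise<boolean> {
  const res = await fetch(`https://api.github.com/users/${login}`);
  return res.status === 404;
}
```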
A core goal was to eliminate friction by making actions instant and in-app. When a user follows or unfollows someone, I use an optimistic update. The UI reflects the change immediately, while the API call happens in the background. If it fails, React Query seamlessly reverts the UI and displays an error. I extended this pattern to bulk operations with a custom useBulkOperation hook, which iterates through a list of users, reports progress in real-time, and adds a small delay between requests to respect API limits.
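A minimal sketch of what a hook like useBulkOperation does under the hood: iterate sequentially, count outcomes, report progress, and sleep briefly between requests. The function name, the delay value, and the return shape are illustrative, not the real hook's API.

```typescript
// Illustrative core of a bulk-operation loop (names and delay are hypothetical).
export async function runBulk<T>(
  items: T[],
  action: (item: T) => Promise<void>,
  onProgress: (done: number, total: number) => void,
  delayMs = 500
): Promise<{ succeeded: number; failed: number }> {
  let succeeded = 0;
  let failed = 0;
  for (const [i, item] of items.entries()) {
    try {
      await action(item); // e.g. an unfollow mutation
      succeeded++;
    } catch {
      failed++; // keep going: one bad user shouldn't abort the batch
    }
    onProgress(i + 1, items.length);
    if (i < items.length - 1) {
      // Small pause between requests to stay well under rate limits.
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  return { succeeded, failed };
}
```

Running the requests sequentially rather than with Promise.all is deliberate: it keeps the progress UI meaningful and spreads the load so a large bulk unfollow never bursts against the API.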
The combination of React Query and Zustand is the engine of the application's client side. They work in harmony because their roles are clearly defined: React Query owns server state (fetching, caching, and retries), while Zustand owns client state (the processed network, the derived non-mutual lists, and UI flags).
This synergy is best seen in the useNetworkData hook. The hook itself is a simple wrapper around React Query's useQuery. However, the queryFn it calls, initializeAndFetchNetwork, is an action from the Zustand store. This creates a clean separation where Zustand decides when and what to fetch, and React Query handles the how.
Building FollowSync was an exercise in pragmatic engineering and a rewarding way to jump back into development.
While the core features are complete, the project is far from over. Next on the roadmap is adding user-configurable settings for cache stale times, giving power users even more control over how the application behaves.
import { create } from 'zustand';
import { toast } from 'sonner';
import { getNonMutuals } from '@/lib/utils';
import { findCacheGist, parseCache, writeCache } from '@/lib/gist';
import { fetchAllUserFollowersAndFollowing } from '@/lib/gql/fetchers';
import {
  GIST_ID_STORAGE_KEY,
  STALE_TIME_LARGE,
  STALE_TIME_MANUAL_ONLY,
  STALE_TIME_MEDIUM,
  STALE_TIME_SMALL,
} from '@/lib/constants';
import { UserInfoFragment } from '@/lib/gql/types';
import { CachedData } from '@/lib/types';

// other type definitions

export type CacheStoreState = {
  network: {
    followers: UserInfoFragment[];
    following: UserInfoFragment[];
  };
  nonMutuals: {
    nonMutualsFollowingYou: UserInfoFragment[];
    nonMutualsYouFollow: UserInfoFragment[];
  };
  ghosts: UserInfoFragment[];
  ghostsSet: Set<string>;
  timestamp: number | null;
  isCheckingGhosts: boolean;
  gistName: string | null;
  metadata: CachedData['metadata'] | null;
  forceNextRefresh: boolean;
};

const initialState: CacheStoreState = {
  network: { followers: [], following: [] },
  nonMutuals: { nonMutualsFollowingYou: [], nonMutualsYouFollow: [] },
  ghosts: [],
  ghostsSet: new Set(),
  timestamp: null,
  isCheckingGhosts: true,
  gistName: null,
  metadata: null,
  forceNextRefresh: false,
};

export const useCacheStore = create<CacheStore>((set, get) => ({
  ...initialState,
  // other actions
  initializeAndFetchNetwork: async (client, username, accessToken, progress) => {
    const { show, update, complete, fail } = progress;
    const gistName = window.localStorage.getItem(GIST_ID_STORAGE_KEY);
    set({ gistName });
    const forceRefresh = get().forceNextRefresh;
    if (forceRefresh) {
      get().setForceNextRefresh(false); // Reset the flag
    }
    if (!forceRefresh) {
      const foundGist = await findCacheGist(client, gistName);
      if (foundGist) {
        const cachedData = parseCache(foundGist);
        if (cachedData) {
          get().setGistName(foundGist.name);
          const totalConnections = cachedData.metadata.totalConnections;
          let staleTime = 0;
          if (totalConnections < 2000) {
            staleTime = STALE_TIME_SMALL;
          } else if (totalConnections < 10000) {
            staleTime = STALE_TIME_MEDIUM;
          } else if (totalConnections < 50000) {
            staleTime = STALE_TIME_LARGE;
          } else {
            staleTime = STALE_TIME_MANUAL_ONLY;
          }
          const isStale = Date.now() - cachedData.timestamp > staleTime;
          get().loadFromCache(cachedData);
          if (!isStale) {
            console.log('Cache is fresh, returning data.');
            toast.info('Loaded fresh data from cache.');
            return cachedData.network;
          }
          // For the manual-only tier, show a different message
          if (staleTime === STALE_TIME_MANUAL_ONLY) {
            console.warn(
              'Cache is stale, but auto-refresh is disabled for large networks. Awaiting manual refresh.'
            );
            toast.info(
              'Data loaded from cache. Refresh manually for the latest update.'
            );
            return cachedData.network;
          }
          console.warn('Cache is stale, fetching fresh data in the background...');
        }
      }
    }
    const fetchStart = performance.now();
    show({
      title: 'Syncing Your Network',
      message: 'Fetching connections from GitHub...',
      items: [
        { label: 'Followers', current: 0, total: 0 },
        { label: 'Following', current: 0, total: 0 },
      ],
    });
    try {
      const networkData = await fetchAllUserFollowersAndFollowing({
        client,
        username,
        onProgress: (p) => {
          update([
            {
              label: 'Followers',
              current: p.fetchedFollowers,
              total: p.totalFollowers,
            },
            {
              label: 'Following',
              current: p.fetchedFollowing,
              total: p.totalFollowing,
            },
          ]);
        },
      });
      const fetchEnd = performance.now();
      const fetchDuration = Math.round((fetchEnd - fetchStart) / 1000);
      const followers = networkData.followers.nodes as UserInfoFragment[];
      const following = networkData.following.nodes as UserInfoFragment[];
      const network = { followers, following };
      const timestamp = Date.now();
      const dataToCache: CachedData = {
        network,
        ghosts: [],
        timestamp,
        metadata: {
          totalConnections: followers.length + following.length,
          fetchDuration,
          cacheVersion: '1.0',
        },
      };
      const newGist = await writeCache(accessToken, dataToCache, get().gistName);
      set({
        network,
        timestamp,
        nonMutuals: getNonMutuals(network),
        metadata: dataToCache.metadata,
      });
      get().setGistName(newGist.id);
      complete();
      return network;
      // eslint-disable-next-line @typescript-eslint/no-explicit-any
    } catch (err: any) {
      fail({ message: err.message || 'Failed to sync network.' });
      throw err;
    }
  },
  // other actions
}));
import { useSession } from 'next-auth/react';
import { useMutation } from '@tanstack/react-query';
import { toast } from 'sonner';
import { useCacheStore } from '@/lib/store/cache';
import { followUser, unfollowUser } from '@/lib/gql/fetchers';
import { useClientAuthenticatedGraphQLClient } from '@/lib/gql/client';
import { getNonMutuals } from '@/lib/utils';
import { UserInfoFragment } from '@/lib/gql/types';
import { CachedData } from '@/lib/types';
import { useModalStore } from '@/lib/store/modal';

export const useFollowManager = () => {
  const { client } = useClientAuthenticatedGraphQLClient();
  const { data: session } = useSession();
  const { getState, gistName, writeCache, updateNetwork } = useCacheStore();
  const { incrementActionCount } = useModalStore();

  const persistChanges = async () => {
    if (!session?.accessToken) return;
    const currentState = getState();
    const { network, ghosts, metadata } = currentState;
    if (!network) return;
    const dataToCache: CachedData = {
      network,
      ghosts,
      timestamp: Date.now(),
      metadata: {
        totalConnections: network.followers.length + network.following.length,
        fetchDuration: metadata?.fetchDuration || 0,
        cacheVersion: metadata?.cacheVersion || '1.0',
      },
    };
    await writeCache(session.accessToken, dataToCache, gistName);
  };

  const followMutation = useMutation({
    mutationFn: (userToFollow: UserInfoFragment) => {
      if (!client) throw new Error('GraphQL client not available');
      return followUser({ client, userId: userToFollow.id });
    },
    onMutate: async (userToFollow: UserInfoFragment) => {
      const previousState = getState();
      const newFollowing = [...previousState.network.following, userToFollow];
      const newNetwork = { ...previousState.network, following: newFollowing };
      const newNonMutuals = getNonMutuals(newNetwork);
      updateNetwork({ network: newNetwork, nonMutuals: newNonMutuals });
      return { previousState };
    },
    onError: (err, userToFollow, context) => {
      if (context?.previousState) {
        updateNetwork({
          network: context.previousState.network,
          nonMutuals: context.previousState.nonMutuals,
        });
      }
      toast.error(`Failed to follow @${userToFollow.login}: ${err.message}`);
    },
  });

  const unfollowMutation = useMutation({
    mutationFn: (userToUnfollow: UserInfoFragment) => {
      if (!client) throw new Error('GraphQL client not available');
      return unfollowUser({ client, userId: userToUnfollow.id });
    },
    onMutate: async (userToUnfollow: UserInfoFragment) => {
      const previousState = getState();
      const newFollowing = previousState.network.following.filter(
        (u) => u.id !== userToUnfollow.id
      );
      const newNetwork = { ...previousState.network, following: newFollowing };
      const newNonMutuals = getNonMutuals(newNetwork);
      updateNetwork({ network: newNetwork, nonMutuals: newNonMutuals });
      return { previousState };
    },
    onError: (err, userToUnfollow, context) => {
      if (context?.previousState) {
        updateNetwork({
          network: context.previousState.network,
          nonMutuals: context.previousState.nonMutuals,
        });
      }
      toast.error(`Failed to unfollow @${userToUnfollow.login}: ${err.message}`);
    },
  });

  return {
    followMutation,
    unfollowMutation,
    persistChanges,
    incrementActionCount,
  };
};
import { useEffect } from 'react';
import { useSession } from 'next-auth/react';
import { useCacheStore } from '@/lib/store/cache';

const BATCH_SIZE = 10;
const DELAY_BETWEEN_BATCHES = 1000; // 1 second

export const useGhostDetector = () => {
  const { data: session } = useSession();
  const accessToken = session?.accessToken;
  const {
    nonMutuals: { nonMutualsYouFollow, nonMutualsFollowingYou },
    ghosts,
    setGhosts,
    setIsCheckingGhosts,
  } = useCacheStore();

  useEffect(() => {
    setIsCheckingGhosts(true);
    const detectGhosts = async () => {
      if (
        ghosts.length > 0 ||
        nonMutualsYouFollow.length === 0 ||
        nonMutualsFollowingYou.length === 0 ||
        !accessToken
      ) {
        setIsCheckingGhosts(false);
        return;
      }
      const potentialGhosts = [
        ...nonMutualsYouFollow,
        ...nonMutualsFollowingYou,
      ].filter(
        (user) =>
          user?.followers.totalCount === 0 && user?.following.totalCount === 0
      );
      if (potentialGhosts.length === 0) {
        setIsCheckingGhosts(false); // nothing to verify; clear the flag before bailing
        return;
      }
      const confirmedGhosts = [];
      for (let i = 0; i < potentialGhosts.length; i += BATCH_SIZE) {
        const batch = potentialGhosts.slice(i, i + BATCH_SIZE);
        const usernames = batch.map((user) => user?.login);
        try {
          const response = await fetch('/api/verify-ghosts', {
            method: 'POST',
            headers: {
              'Content-Type': 'application/json',
            },
            body: JSON.stringify({ usernames }),
          });
          if (response.ok) {
            const { ghosts: ghostUsernames } = await response.json();
            const batchGhosts = batch.filter((user) =>
              ghostUsernames.includes(user?.login)
            );
            confirmedGhosts.push(...batchGhosts);
          }
        } catch (error) {
          console.error('Error verifying ghost batch:', error);
        }
        if (i + BATCH_SIZE < potentialGhosts.length) {
          await new Promise((resolve) =>
            setTimeout(resolve, DELAY_BETWEEN_BATCHES)
          );
        }
      }
      setIsCheckingGhosts(false);
      await setGhosts(confirmedGhosts, accessToken);
    };
    detectGhosts();
  }, [
    nonMutualsFollowingYou,
    nonMutualsYouFollow,
    accessToken,
    setGhosts,
    ghosts.length,
    setIsCheckingGhosts,
  ]);
};
import { useQuery } from '@tanstack/react-query';
import { useSession } from 'next-auth/react';
import { useClientAuthenticatedGraphQLClient } from '@/lib/gql/client';
import { QUERY_KEY_USER_NETWORK } from '@/lib/constants';
import { useCacheStore } from '@/lib/store/cache';
import { useProgress } from '@/lib/context/progress';

export const useNetworkData = (username?: string) => {
  const { client, status: authStatus } = useClientAuthenticatedGraphQLClient();
  const { data: session } = useSession();
  const { initializeAndFetchNetwork, setForceNextRefresh } = useCacheStore();
  const progress = useProgress();

  const queryResult = useQuery({
    queryKey: [QUERY_KEY_USER_NETWORK, username],
    queryFn: async () => {
      if (!client || !username || !session?.accessToken) {
        throw new Error('Client, username, or session not available.');
      }
      const data = await initializeAndFetchNetwork(
        client,
        username,
        session.accessToken,
        progress
      );
      return data;
    },
    enabled: !!client && authStatus === 'authenticated' && !!session,
    retry: false,
    staleTime: Infinity,
    refetchOnMount: false,
    refetchOnWindowFocus: false,
  });

  const forceRefetch = async () => {
    setForceNextRefresh(true);
    await queryResult.refetch();
  };

  return { ...queryResult, refetch: forceRefetch };
};