No Server, No Database, No Problem: The Client-Heavy Architecture of Follow Sync

A serverless GitHub manager for finding non-mutuals. Uses a client-heavy architecture with GitHub Gists as a database and adaptive caching to respect API limits.

Jul 2025 · Completed · vercel.app · Source

Table of Contents

  • Introduction
  • The Technical Blueprint: Pivoting to the Client
  • The "GitHub-as-Infrastructure" Philosophy & Its Trade-offs
  • The Development Journey: Engineering for the Edge Cases
  • Challenge 1: Engineering a Rate-Limit-Aware Caching System
  • Snippet: cache.ts
  • Challenge 2: Detecting 'Ghost' Accounts via the Edge
  • Snippet: useGhostDetector.ts
  • Snippet: route.ts
  • Challenge 3: Building a Responsive UI with Optimistic Updates
  • Snippet: useFollowManager.ts
  • Code Spotlight: Orchestrating State with React Query and Zustand
  • Snippet: useNetworkManager.ts
  • Snippet: cache.ts
  • Lessons Learned and The Road Ahead

Introduction

After a period of intense work through 2024, I took a necessary step back from development in early 2025. When I was ready to dive back in a few months later, I found my GitHub account buzzing with activity. My network had grown significantly, and I had no easy way to manage it or understand my connections.

My search for a solution led me to some existing tools, like Hesbon-Osoro's follow-for-follow-back project. It was a great starting point, but I quickly identified its limitations. To unfollow someone, I was redirected to their GitHub page to click the button manually, and there was no way to perform bulk actions. This friction sparked an idea. I would build my own tool, not just to solve my own problem, but also as the perfect challenge to reignite my passion for coding.

The Technical Blueprint: Pivoting to the Client

I had two core principles for this project: the architecture had to be simple, and it had to be free to run. I also knew that for a tool like this to be truly useful, it couldn't rely on a shared, easily exhausted API rate limit. The first major decision was to use a GitHub OAuth App, giving every user their own 5,000 requests/hour quota to work with.
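
To make that concrete, the auth wiring for this kind of setup looks roughly like the sketch below: a GitHub OAuth provider via NextAuth that requests the scopes the app needs and exposes the user's access token to the browser. The file name, scope list, and callback shape here are illustrative assumptions, not a copy of the app's actual configuration.

    // auth.ts - illustrative NextAuth setup (assumed names and scopes, not the project's actual file)
    import NextAuth, { type NextAuthOptions } from 'next-auth';
    import GitHubProvider from 'next-auth/providers/github';

    export const authOptions: NextAuthOptions = {
      providers: [
        GitHubProvider({
          clientId: process.env.GITHUB_CLIENT_ID!,
          clientSecret: process.env.GITHUB_CLIENT_SECRET!,
          // user:follow allows follow/unfollow actions; gist allows reading/writing the cache Gist
          authorization: { params: { scope: 'read:user user:follow gist' } },
        }),
      ],
      callbacks: {
        // Persist the OAuth access token in the JWT so the browser can call the GitHub API directly
        async jwt({ token, account }) {
          if (account?.access_token) token.accessToken = account.access_token;
          return token;
        },
        async session({ session, token }) {
          (session as { accessToken?: unknown }).accessToken = token.accessToken;
          return session;
        },
      },
    };

    export default NextAuth(authOptions);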

My initial impulse was to use Next.js's server capabilities to fetch a user's entire network. However, I quickly realized this would be a dead end for power users. A full network sync for an account with tens of thousands of connections could easily exceed the execution limits on Vercel's free tier. Since the app's primary objective was completeness (you need the full picture to accurately calculate non-mutuals), a partial fetch wasn't an option. This forced a crucial pivot:

  • Follow Sync would be a client-heavy application. The user's own browser would handle the long-running data fetch, bypassing serverless timeouts entirely.

With the fetching strategy decided, I needed a place to cache the processed data. As I explored options, the solution hit me: why not use GitHub itself? I could leverage GitHub Gists as a user-owned database. For state management, I chose a combination of React Query for server state and Zustand for client state.
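
To make the Gist-as-database idea concrete, here's a rough sketch of what reading and writing such a cache looks like against GitHub's REST API. The file name, payload shape, and function names are illustrative; the app's actual gist.ts helpers differ.

    // gist-cache.ts - illustrative sketch, not the app's actual gist helpers
    const GITHUB_API = 'https://api.github.com';
    const CACHE_FILENAME = 'follow-sync-cache.json'; // assumed file name

    type CachePayload = { network: unknown; timestamp: number };

    export async function writeCacheGist(token: string, data: CachePayload, gistId?: string | null) {
      // PATCH updates an existing Gist; POST creates a new secret one
      const url = gistId ? `${GITHUB_API}/gists/${gistId}` : `${GITHUB_API}/gists`;
      const res = await fetch(url, {
        method: gistId ? 'PATCH' : 'POST',
        headers: { Authorization: `Bearer ${token}`, Accept: 'application/vnd.github+json' },
        body: JSON.stringify({
          description: 'FollowSync cache',
          public: false,
          files: { [CACHE_FILENAME]: { content: JSON.stringify(data) } },
        }),
      });
      if (!res.ok) throw new Error(`Gist write failed: ${res.status}`);
      return (await res.json()) as { id: string };
    }

    export async function readCacheGist(token: string, gistId: string): Promise<CachePayload | null> {
      const res = await fetch(`${GITHUB_API}/gists/${gistId}`, {
        headers: { Authorization: `Bearer ${token}`, Accept: 'application/vnd.github+json' },
      });
      if (!res.ok) return null;
      const gist = await res.json();
      const file = gist.files?.[CACHE_FILENAME];
      return file?.content ? (JSON.parse(file.content) as CachePayload) : null;
    }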


The "GitHub-as-Infrastructure" Philosophy & Its Trade-offs

Using Gists as a database is the cornerstone of the project's "GitHub-as-Infrastructure" approach. The benefits are significant:

  • Zero Cost: It leverages existing infrastructure, completely avoiding server and database hosting fees.
  • User-Owned Data: The analysis cache is stored in a Gist owned by the user, ensuring they have full control and privacy.
  • Built-in Versioning: Gists automatically track revision history, which opens the door for future features like analyzing network changes over time.

However, this approach requires accepting certain trade-offs, which can be viewed through the lens of the CAP theorem's principles: Consistency, Availability, and Partition Tolerance. For a tool like FollowSync, you can only optimize for two of these three. The app prioritizes:

  • Availability: The app is always available and loads instantly because it reads directly from the Gist cache on the user's machine.
  • Partition Tolerance: The app remains functional even if it can't reach the GitHub API. Users can still browse and analyze their last-synced network data.

It does this by sacrificing strict Consistency (or freshness). The data in the cache is a snapshot, not a real-time reflection of a user's GitHub network. This trade-off is the key to providing a fast, resilient, and free service.

[Image: FollowSync architecture diagram]

View the full mermaid diagram here.

The Development Journey: Engineering for the Edge Cases

With the foundation laid, I ran into three interesting technical challenges that defined the project.

Challenge 1: Engineering a Rate-Limit-Aware Caching System

A "one-size-fits-all" caching strategy was never going to work. The solution was an adaptive, stale-while-revalidate caching system. The application first loads instantly from the Gist cache, then checks the cache's timestamp against a stale time that varies based on the user's network size.

Doing the Math

To understand why this was critical, let's look at the numbers. GitHub's GraphQL API calculates rate limit costs based on the number of nodes requested, roughly 1 point per 100 nodes in the larger of the follower/following lists.

  • A Large Network (~25,000 connections) costs about 250 points per sync. Refreshing every 15 minutes (4 times/hour) would consume ~1,000 points, which is well within the 5,000-point limit.
  • An Extreme Network (>150,000 connections) costs about 1,500 points per sync. Refreshing every 15 minutes would consume ~6,000 points, exceeding the hourly limit and breaking the app.

This calculation makes it clear that a dynamic approach based on network size isn't just a nice-to-have; it's essential for the tool to function for its target power users.
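
As a sanity check, here's that arithmetic and the kind of tiered stale times it leads to, in code. The size thresholds mirror the cache.ts snippet further down, but the durations and function names are illustrative rather than the app's exact constants.

    // Rough rate-limit arithmetic for a full network sync (illustrative)
    const POINTS_PER_HOUR = 5000; // per-user GraphQL budget with an OAuth app
    const NODES_PER_POINT = 100; // ~1 point per 100 nodes requested

    function pointsPerSync(totalConnections: number): number {
      return Math.ceil(totalConnections / NODES_PER_POINT);
    }

    function maxSyncsPerHour(totalConnections: number): number {
      return Math.floor(POINTS_PER_HOUR / pointsPerSync(totalConnections));
    }

    // Map network size to a stale time in minutes; the durations here are assumptions
    function staleTimeMinutes(totalConnections: number): number | 'manual-only' {
      if (totalConnections < 2000) return 15;
      if (totalConnections < 10000) return 60;
      if (totalConnections < 50000) return 6 * 60;
      return 'manual-only'; // extreme networks never auto-refresh
    }

    console.log(pointsPerSync(25000)); // 250 points -> a 15-minute refresh cycle fits comfortably
    console.log(maxSyncsPerHour(150000)); // only ~3 syncs/hour fit, so auto-refresh must back off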

Challenge 2: Detecting 'Ghost' Accounts via the Edge

While testing, I stumbled upon "ghost" accounts: deleted users who still appear in follower lists and lead to a 404 page. The GitHub API provides no way to detect them. The key insight was twofold: ghosts primarily matter when they are non-mutuals, and they almost always have 0 followers and 0 following.

The final solution is a multi-step process:

  • On the client, the app filters the list of non-mutuals for any user with zero followers and zero following.
  • This much smaller list of "potential ghosts" is sent in small batches to a Vercel Edge Function.
  • The edge function performs a lightweight HEAD request for each username and returns a list of those that resulted in a 404, confirming them as ghosts.
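
The app's actual route.ts isn't reproduced below, but a minimal sketch of such an edge route, matching the request and response shape the useGhostDetector.ts snippet expects (a POST to /api/verify-ghosts with a usernames array, answered with a ghosts array), could look like this:

    // app/api/verify-ghosts/route.ts - illustrative sketch of the HEAD-check edge route
    import { NextResponse } from 'next/server';

    export const runtime = 'edge';

    export async function POST(request: Request) {
      const { usernames } = (await request.json()) as { usernames: string[] };

      // HEAD each profile page; a 404 means the account no longer exists
      const checks = await Promise.all(
        usernames.map(async (username) => {
          const res = await fetch(`https://github.com/${username}`, { method: 'HEAD' });
          return { username, isGhost: res.status === 404 };
        })
      );

      const ghosts = checks.filter((c) => c.isGhost).map((c) => c.username);
      return NextResponse.json({ ghosts });
    }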

Challenge 3: Building a Responsive UI with Optimistic Updates

A core goal was to eliminate friction by making actions instant and in-app. When a user follows or unfollows someone, I use an optimistic update. The UI reflects the change immediately, while the API call happens in the background. If it fails, React Query seamlessly reverts the UI and displays an error. I extended this pattern to bulk operations with a custom useBulkOperation hook, which iterates through a list of users, reports progress in real time, and adds a small delay between requests to respect API limits.
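
The useBulkOperation hook itself isn't in the snippets below, so here's a hedged sketch of the idea: run the single-user mutation for each selected user, report progress as it goes, and pause briefly between requests. The names, progress shape, and delay value are illustrative.

    // useBulkOperation.ts - illustrative sketch, not the app's actual hook
    import { useState } from 'react';

    const DELAY_BETWEEN_REQUESTS_MS = 500; // assumed pacing between requests

    type BulkProgress = { done: number; total: number; failed: string[] };

    export function useBulkOperation<T extends { login: string }>(
      action: (user: T) => Promise<unknown> // e.g. followMutation.mutateAsync
    ) {
      const [progress, setProgress] = useState<BulkProgress>({ done: 0, total: 0, failed: [] });

      const run = async (users: T[]) => {
        setProgress({ done: 0, total: users.length, failed: [] });

        for (const user of users) {
          try {
            await action(user);
          } catch {
            setProgress((p) => ({ ...p, failed: [...p.failed, user.login] }));
          }
          setProgress((p) => ({ ...p, done: p.done + 1 }));
          // Pause between requests so bulk actions don't burst against the API
          await new Promise((resolve) => setTimeout(resolve, DELAY_BETWEEN_REQUESTS_MS));
        }
      };

      return { run, progress };
    }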

Code Spotlight: Orchestrating State with React Query and Zustand

The combination of React Query and Zustand is the engine of the application's client side. They work in perfect harmony because their roles are clearly defined:

  • React Query is the dedicated server state layer. Its only job is to manage the lifecycle of asynchronous data: fetching, caching, and handling background updates and errors.
  • Zustand is the client state and orchestration layer. It holds UI state (like filters) and, crucially, contains the business logic that calls upon React Query to fetch data.

This synergy is best seen in the useNetworkData hook. The hook itself is a simple wrapper around React Query's useQuery. However, the queryFn it calls, initializeAndFetchNetwork, is an action from the Zustand store. This creates a clean separation where Zustand decides when and what to fetch, and React Query handles the how.

Lessons Learned and The Road Ahead

Building FollowSync was an exercise in pragmatic engineering and a rewarding way to jump back into development.

  • Client-Heavy is a Viable Strategy: Offloading long-running tasks to the client was the right call, allowing me to work within the constraints of a free, serverless platform without sacrificing core features.
  • Leverage the Platform You're On: Using Gists as a database was a reminder that creative solutions are often hiding in plain sight. It simplified the architecture, eliminated costs, and improved user data ownership.
  • Design for the Power User: Anticipating the needs of users with massive networks from the start forced me to build more robust and scalable solutions, which ultimately benefits everyone.

While the core features are complete, the project is far from over. Next on the roadmap is adding user-configurable settings for cache stale times, giving power users even more control over how the application behaves.


    cache.ts

    import { create } from 'zustand';
    import { toast } from 'sonner';
    import { getNonMutuals } from '@/lib/utils';
    import { findCacheGist, parseCache, writeCache } from '@/lib/gist';
    import { fetchAllUserFollowersAndFollowing } from '@/lib/gql/fetchers';
    import {
      GIST_ID_STORAGE_KEY,
      STALE_TIME_LARGE,
      STALE_TIME_MANUAL_ONLY,
      STALE_TIME_MEDIUM,
      STALE_TIME_SMALL,
    } from '@/lib/constants';
    import { UserInfoFragment } from '@/lib/gql/types';
    import { CachedData } from '@/lib/types';
    
      //other type definitions
    
    export type CacheStoreState = {
      network: {
        followers: UserInfoFragment[];
        following: UserInfoFragment[];
      };
      nonMutuals: {
        nonMutualsFollowingYou: UserInfoFragment[];
        nonMutualsYouFollow: UserInfoFragment[];
      };
      ghosts: UserInfoFragment[];
      ghostsSet: Set<string>;
      timestamp: number | null;
      isCheckingGhosts: boolean;
      gistName: string | null;
      metadata: CachedData['metadata'] | null;
      forceNextRefresh: boolean;
    };
    
    const initialState: CacheStoreState = {
      network: { followers: [], following: [] },
      nonMutuals: { nonMutualsFollowingYou: [], nonMutualsYouFollow: [] },
      ghosts: [],
      ghostsSet: new Set(),
      timestamp: null,
      isCheckingGhosts: true,
      gistName: null,
      metadata: null,
      forceNextRefresh: false,
    };
    
    
    export const useCacheStore = create<CacheStore>((set, get) => ({
      ...initialState,
    
      // other actions
    
      initializeAndFetchNetwork: async (
        client,
        username,
        accessToken,
        progress
      ) => {
        const { show, update, complete, fail } = progress;
        const gistName = window.localStorage.getItem(GIST_ID_STORAGE_KEY);
        set({ gistName });
    
        const forceRefresh = get().forceNextRefresh;
        if (forceRefresh) {
          get().setForceNextRefresh(false); // Reset the flag
        }
    
        if (!forceRefresh) {
          const foundGist = await findCacheGist(client, gistName);
          if (foundGist) {
            const cachedData = parseCache(foundGist);
            if (cachedData) {
              get().setGistName(foundGist.name);
              const totalConnections = cachedData.metadata.totalConnections;
              let staleTime = 0;
    
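              // Adaptive tiers: the bigger the network, the longer the stale time, so background refreshes stay within the hourly rate limit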
              if (totalConnections < 2000) {
                staleTime = STALE_TIME_SMALL;
              } else if (totalConnections < 10000) {
                staleTime = STALE_TIME_MEDIUM;
              } else if (totalConnections < 50000) {
                staleTime = STALE_TIME_LARGE;
              } else {
                staleTime = STALE_TIME_MANUAL_ONLY;
              }
    
              const isStale = Date.now() - cachedData.timestamp > staleTime;
              get().loadFromCache(cachedData);
    
              if (!isStale) {
                console.log('Cache is fresh, returning data.');
                toast.info('Loaded fresh data from cache.');
                return cachedData.network;
              }
    
              // For manual-only tier, show a different message
              if (staleTime === STALE_TIME_MANUAL_ONLY) {
                console.warn(
                  'Cache is stale, but auto-refresh is disabled for large networks. Awaiting manual refresh.'
                );
                toast.info(
                  'Data loaded from cache. Refresh manually for the latest update.'
                );
                return cachedData.network;
              }
    
              console.warn(
                'Cache is stale, fetching fresh data in the background...'
              );
            }
          }
        }
    
        const fetchStart = performance.now();
        show({
          title: 'Syncing Your Network',
          message: 'Fetching connections from GitHub...',
          items: [
            { label: 'Followers', current: 0, total: 0 },
            { label: 'Following', current: 0, total: 0 },
          ],
        });
    
        try {
          const networkData = await fetchAllUserFollowersAndFollowing({
            client,
            username,
            onProgress: (p) => {
              update([
                {
                  label: 'Followers',
                  current: p.fetchedFollowers,
                  total: p.totalFollowers,
                },
                {
                  label: 'Following',
                  current: p.fetchedFollowing,
                  total: p.totalFollowing,
                },
              ]);
            },
          });
    
          const fetchEnd = performance.now();
          const fetchDuration = Math.round((fetchEnd - fetchStart) / 1000);
          const followers = networkData.followers.nodes as UserInfoFragment[];
          const following = networkData.following.nodes as UserInfoFragment[];
          const network = { followers, following };
          const timestamp = Date.now();
    
          const dataToCache: CachedData = {
            network,
            ghosts: [],
            timestamp,
            metadata: {
              totalConnections: followers.length + following.length,
              fetchDuration,
              cacheVersion: '1.0',
            },
          };
    
          const newGist = await writeCache(
            accessToken,
            dataToCache,
            get().gistName
          );
    
          set({
            network,
            timestamp,
            nonMutuals: getNonMutuals(network),
            metadata: dataToCache.metadata,
          });
    
          get().setGistName(newGist.id);
          complete();
    
          return network;
          // eslint-disable-next-line @typescript-eslint/no-explicit-any
        } catch (err: any) {
          fail({ message: err.message || 'Failed to sync network.' });
          throw err;
        }
      },
    
      // other actions
    }));

    useFollowManager.ts

    import { useSession } from 'next-auth/react';
    import { useMutation } from '@tanstack/react-query';
    import { toast } from 'sonner';
    import { useCacheStore } from '@/lib/store/cache';
    import { followUser, unfollowUser } from '@/lib/gql/fetchers';
    import { useClientAuthenticatedGraphQLClient } from '@/lib/gql/client';
    import { getNonMutuals } from '@/lib/utils';
    import { UserInfoFragment } from '@/lib/gql/types';
    import { CachedData } from '@/lib/types';
    import { useModalStore } from '@/lib/store/modal';
    
    export const useFollowManager = () => {
      const { client } = useClientAuthenticatedGraphQLClient();
      const { data: session } = useSession();
      const { getState, gistName, writeCache, updateNetwork } = useCacheStore();
      const { incrementActionCount } = useModalStore();
    
      const persistChanges = async () => {
        if (!session?.accessToken) return;
    
        const currentState = getState();
        const { network, ghosts, metadata } = currentState;
    
        if (!network) return;
    
        const dataToCache: CachedData = {
          network,
          ghosts,
          timestamp: Date.now(),
          metadata: {
            totalConnections: network.followers.length + network.following.length,
            fetchDuration: metadata?.fetchDuration || 0,
            cacheVersion: metadata?.cacheVersion || '1.0',
          },
        };
    
        await writeCache(session.accessToken, dataToCache, gistName);
      };
    
      const followMutation = useMutation({
        mutationFn: (userToFollow: UserInfoFragment) => {
          if (!client) throw new Error('GraphQL client not available');
          return followUser({ client, userId: userToFollow.id });
        },
        onMutate: async (userToFollow: UserInfoFragment) => {
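          // Optimistic update: apply the follow locally right away and snapshot the previous state so onError can roll back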
          const previousState = getState();
          const newFollowing = [...previousState.network.following, userToFollow];
          const newNetwork = { ...previousState.network, following: newFollowing };
          const newNonMutuals = getNonMutuals(newNetwork);
    
          updateNetwork({ network: newNetwork, nonMutuals: newNonMutuals });
    
          return { previousState };
        },
        onError: (err, userToFollow, context) => {
          if (context?.previousState) {
            updateNetwork({
              network: context.previousState.network,
              nonMutuals: context.previousState.nonMutuals,
            });
          }
          toast.error(`Failed to follow @${userToFollow.login}: ${err.message}`);
        },
      });
    
      const unfollowMutation = useMutation({
        mutationFn: (userToUnfollow: UserInfoFragment) => {
          if (!client) throw new Error('GraphQL client not available');
          return unfollowUser({ client, userId: userToUnfollow.id });
        },
        onMutate: async (userToUnfollow: UserInfoFragment) => {
          const previousState = getState();
          const newFollowing = previousState.network.following.filter(
            (u) => u.id !== userToUnfollow.id
          );
          const newNetwork = { ...previousState.network, following: newFollowing };
          const newNonMutuals = getNonMutuals(newNetwork);
    
          updateNetwork({ network: newNetwork, nonMutuals: newNonMutuals });
    
          return { previousState };
        },
        onError: (err, userToUnfollow, context) => {
          if (context?.previousState) {
            updateNetwork({
              network: context.previousState.network,
              nonMutuals: context.previousState.nonMutuals,
            });
          }
          toast.error(
            `Failed to unfollow @${userToUnfollow.login}: ${err.message}`
          );
        },
      });
    
      return {
        followMutation,
        unfollowMutation,
        persistChanges,
        incrementActionCount,
      };
    };

    useGhostDetector.ts

    import { useEffect } from 'react';
    import { useSession } from 'next-auth/react';
    import { useCacheStore } from '@/lib/store/cache';
    
    const BATCH_SIZE = 10;
    const DELAY_BETWEEN_BATCHES = 1000; // 1 second
    
    export const useGhostDetector = () => {
      const { data: session } = useSession();
      const accessToken = session?.accessToken;
      const {
        nonMutuals: { nonMutualsYouFollow, nonMutualsFollowingYou },
        ghosts,
        setGhosts,
        setIsCheckingGhosts,
      } = useCacheStore();
    
      useEffect(() => {
        setIsCheckingGhosts(true);
    
        const detectGhosts = async () => {
          if (
            ghosts.length > 0 ||
            nonMutualsYouFollow.length === 0 ||
            nonMutualsFollowingYou.length === 0 ||
            !accessToken
          ) {
            setIsCheckingGhosts(false);
            return;
          }
    
          const potentialGhosts = [
            ...nonMutualsYouFollow,
            ...nonMutualsFollowingYou,
          ].filter(
            (user) =>
              user?.followers.totalCount === 0 && user?.following.totalCount === 0
          );
    
          if (potentialGhosts.length === 0) {
            // No candidates to verify; clear the loading flag before bailing out
            setIsCheckingGhosts(false);
            return;
          }
    
          const confirmedGhosts = [];
    
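          // Verify candidates against the edge route in small batches, pausing between batches to avoid request bursts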
          for (let i = 0; i < potentialGhosts.length; i += BATCH_SIZE) {
            const batch = potentialGhosts.slice(i, i + BATCH_SIZE);
            const usernames = batch.map((user) => user?.login);
    
            try {
              const response = await fetch('/api/verify-ghosts', {
                method: 'POST',
                headers: {
                  'Content-Type': 'application/json',
                },
                body: JSON.stringify({ usernames }),
              });
    
              if (response.ok) {
                const { ghosts: ghostUsernames } = await response.json();
                const batchGhosts = batch.filter((user) =>
                  ghostUsernames.includes(user?.login)
                );
                confirmedGhosts.push(...batchGhosts);
              }
            } catch (error) {
              console.error('Error verifying ghost batch:', error);
            }
    
            if (i + BATCH_SIZE < potentialGhosts.length) {
              await new Promise((resolve) =>
                setTimeout(resolve, DELAY_BETWEEN_BATCHES)
              );
            }
          }
    
          setIsCheckingGhosts(false);
          await setGhosts(confirmedGhosts, accessToken);
        };
    
        detectGhosts();
      }, [
        nonMutualsFollowingYou,
        nonMutualsYouFollow,
        accessToken,
        setGhosts,
        ghosts.length,
        setIsCheckingGhosts,
      ]);
    };

    useNetworkManager.ts

    import { useQuery } from '@tanstack/react-query';
    import { useSession } from 'next-auth/react';
    
    import { useClientAuthenticatedGraphQLClient } from '@/lib/gql/client';
    import { QUERY_KEY_USER_NETWORK } from '@/lib/constants';
    import { useCacheStore } from '@/lib/store/cache';
    import { useProgress } from '@/lib/context/progress';
    
    export const useNetworkData = (username?: string) => {
      const { client, status: authStatus } = useClientAuthenticatedGraphQLClient();
      const { data: session } = useSession();
      const { initializeAndFetchNetwork, setForceNextRefresh } = useCacheStore();
      const progress = useProgress();
    
      const queryResult = useQuery({
        queryKey: [QUERY_KEY_USER_NETWORK, username],
        queryFn: async () => {
          if (!client || !username || !session?.accessToken) {
            throw new Error('Client, username, or session not available.');
          }
          const data = await initializeAndFetchNetwork(
            client,
            username,
            session.accessToken,
            progress
          );
          return data;
        },
        enabled: !!client && authStatus === 'authenticated' && !!session,
        retry: false,
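        // Freshness is governed by the Gist cache's adaptive stale times, so React Query's own revalidation triggers are disabled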
        staleTime: Infinity,
        refetchOnMount: false,
        refetchOnWindowFocus: false,
      });
    
      const forceRefetch = async () => {
        setForceNextRefresh(true);
        await queryResult.refetch();
      };
    
      return { ...queryResult, refetch: forceRefetch };
    };