
No Server, No Database, No Problem: The Client-Heavy Architecture of Follow Sync

A serverless GitHub manager for finding non-mutuals. Uses a client-heavy architecture with GitHub Gists as a database and adaptive caching to respect API limits.

Jul 2025 · Completed · vercel.app · Source

Table of Contents

  • Introduction
  • The Technical Blueprint: Pivoting to the Client
  • The "GitHub-as-Infrastructure" Philosophy & Its Trade-offs
  • The Development Journey: Engineering for the Edge Cases
  • Challenge 1: Engineering a Rate-Limit-Aware Caching System
  • Snippet: cache.ts
  • Challenge 2: Detecting 'Ghost' Accounts via the Edge
  • Snippet: useGhostDetector.ts
  • Snippet: route.ts
  • Challenge 3: Building a Responsive UI with Optimistic Updates
  • Snippet: useFollowManager.ts
  • Challenge 4: Empowering Users with Custom Settings
  • Snippet: settings.ts
  • Snippet: settingsModal.tsx
  • Code Spotlight: From Monolith to Modular State Management
  • Snippet: useCacheManager.ts
  • Snippet: network.ts
  • Snippet: ghost.ts
  • Snippet: gist.ts
  • Lessons Learned and The Road Ahead

Introduction

After a period of intense work through 2024, I took a necessary step back from development in early 2025. When I was ready to dive back in a few months later, I found my GitHub account buzzing with activity. My network had grown significantly, and I had no easy way to manage it or understand my connections.

My search for a solution led me to some existing tools, like Hesbon-Osoro's follow-for-follow-back project. It was a great starting point, but I quickly identified its limitations. To unfollow someone, I was redirected to their GitHub page to click the button manually, and there was no way to perform bulk actions. This friction sparked an idea. I would build my own tool, not just to solve my own problem, but also as the perfect challenge to reignite my passion for coding.

The Technical Blueprint: Pivoting to the Client

I had two core principles for this project: the architecture had to be simple, and it had to be free to run. I also knew that for a tool like this to be truly useful, it couldn't rely on a shared, easily exhausted API rate limit. The first major decision was to use a GitHub OAuth App, giving every user their own 5,000 requests/hour quota to work with.

My initial impulse was to use Next.js's server capabilities to fetch a user's entire network. However, I quickly realized this would be a dead end for power users. A full network sync for an account with tens of thousands of connections could easily exceed the execution limits on Vercel's free tier. Since the app's primary objective was completeness (you need the full picture to accurately calculate non-mutuals), a partial fetch wasn't an option. This forced a crucial pivot:

  • Follow Sync would be a client-heavy application. The user's own browser would handle the long-running data fetch, bypassing serverless timeouts entirely.
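The client-side fetch is just cursor pagination repeated until GitHub reports no more pages. Here is a minimal sketch of that loop, with a generic `drainPages` helper standing in for the project's actual fetcher (the `Page` shape mirrors GraphQL connection pagination; names are illustrative):

```typescript
// One page of a paginated GraphQL connection.
interface Page<T> {
  nodes: T[];
  endCursor: string | null;
  hasNextPage: boolean;
}

// Keep requesting pages in the browser until the API reports no more.
// Because this runs client-side, there is no serverless timeout to hit.
async function drainPages<T>(
  fetchPage: (cursor: string | null) => Promise<Page<T>>
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | null = null;
  let hasNext = true;
  while (hasNext) {
    const page = await fetchPage(cursor);
    all.push(...page.nodes);
    cursor = page.endCursor;
    hasNext = page.hasNextPage;
  }
  return all;
}
```

The same helper can drain the followers and following connections independently, which is what makes progress reporting per list straightforward.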

With the fetching strategy decided, I needed a place to cache the processed data. As I explored options, the solution hit me: why not use GitHub itself? I could leverage GitHub Gists as a user-owned database. For state management, I chose a combination of React Query for server state and Zustand for client state.
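As a sketch of what "Gist as database" looks like in practice, here is an illustrative payload builder for GitHub's Gist REST API (POST /gists to create, PATCH /gists/{id} to update). The filename and data shape are assumptions of this sketch, not the project's actual `writeCache`:

```typescript
// Minimal shape of the cached analysis; the real CachedData carries more.
interface CachedData {
  network: { followers: unknown[]; following: unknown[] };
  timestamp: number;
}

const CACHE_FILENAME = 'follow-sync-cache.json'; // assumed filename

// Build the request body for GitHub's Gist REST API.
function buildGistPayload(data: CachedData) {
  return {
    description: 'FollowSync network cache',
    public: false, // secret gist: owned by and visible to the user alone
    files: {
      [CACHE_FILENAME]: { content: JSON.stringify(data) },
    },
  };
}

// Usage (requires an OAuth token with the `gist` scope):
// await fetch('https://api.github.com/gists', {
//   method: 'POST',
//   headers: { Authorization: `Bearer ${token}` },
//   body: JSON.stringify(buildGistPayload(data)),
// });
```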

The "GitHub-as-Infrastructure" Philosophy & Its Trade-offs

Using Gists as a database is the cornerstone of the project's "GitHub-as-Infrastructure" approach. The benefits are significant:

  • Zero Cost: It leverages existing infrastructure, completely avoiding server and database hosting fees.
  • User-Owned Data: The analysis cache is stored in a Gist owned by the user, ensuring they have full control and privacy.
  • Built-in Versioning: Gists automatically track revision history, which opens the door for future features like analyzing network changes over time.

However, this approach requires accepting certain trade-offs, which can be viewed through the lens of the CAP theorem's principles: Consistency, Availability, and Partition Tolerance. For a tool like FollowSync, you can only optimize for two of these three. The app prioritizes:

  • Availability: The app is always available and loads instantly because it reads directly from the Gist cache on the user's machine.
  • Partition Tolerance: The app remains functional even if it can't reach the GitHub API. Users can still browse and analyze their last-synced network data.

It does this by sacrificing strict Consistency (or freshness). The data in the cache is a snapshot, not a real-time reflection of a user's GitHub network. This trade-off is the key to providing a fast, resilient, and free service.


View the full mermaid diagram here.

The Development Journey: Engineering for the Edge Cases

With the foundation laid, I ran into four technical challenges that defined the project.

Challenge 1: Engineering a Rate-Limit-Aware Caching System

A "one-size-fits-all" caching strategy was never going to work. The solution was an adaptive, stale-while-revalidate caching system. The application first loads instantly from the Gist cache, then checks the cache's timestamp against a stale time that varies based on the user's network size.

Doing the Math

To understand why this was critical, let's look at the numbers. GitHub's GraphQL API calculates rate limit costs based on the number of nodes requested, roughly 1 point per 100 nodes in the larger of the follower/following lists.

  • A Large Network (~25,000 connections) costs about 250 points per sync. Refreshing every 15 minutes (4 times/hour) would consume ~1,000 points, which is well within the 5,000-point limit.
  • An Extreme Network (>150,000 connections) costs about 1,500 points per sync. Refreshing every 15 minutes would consume ~6,000 points, exceeding the hourly limit and breaking the app.

This calculation makes it clear that a dynamic approach based on network size isn't just a nice-to-have; it's essential for the tool to function for its target power users.
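The arithmetic above can be captured in two small helpers. The 1-point-per-100-nodes rate is an approximation of GitHub's actual GraphQL cost formula, used here only to reproduce the estimates in the bullets:

```typescript
// Approximate GraphQL cost of one full sync: ~1 point per 100 nodes
// in the larger of the two lists.
function syncCostPoints(followers: number, following: number): number {
  return Math.ceil(Math.max(followers, following) / 100);
}

// Hourly budget consumed at a given refresh cadence.
function pointsPerHour(costPerSync: number, refreshesPerHour: number): number {
  return costPerSync * refreshesPerHour;
}

// A ~25,000-node list at 4 refreshes/hour uses ~1,000 of the 5,000-point
// budget; a 150,000-node list at the same cadence would use ~6,000.
```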

The Adaptive Cache Tiers

I developed a tiered caching strategy based on the total number of connections (followers + following). This ensures the cache remains fresh for smaller, more dynamic networks while preventing unnecessary API calls for larger, more stable ones.

Here's a detailed breakdown of the tiers:

  • Small Network (Up to 2,000 connections): At this level, a user's network can change frequently. A staleTime of 15 minutes ensures the data is reasonably current without being overly aggressive on the API.
  • Medium Network (2,001 to 10,000 connections): For these established users, the network is more stable. The staleTime is increased to 3 hours, balancing freshness with API conservation.
  • Large Network (10,001 to 50,000 connections): Fetching a network of this size is a significant operation. The assumption is that the overall structure doesn't change dramatically day-to-day. Therefore, the staleTime is set to 12 hours.
  • Extreme Network (50,001+ connections): For the true power users, fetching their network is an intensive task that could push API limits. At this tier, automatic refreshing is disabled (staleTime is effectively infinite), and the user is prompted to refresh the cache manually. This puts the user in full control, preventing unexpected, long-running syncs.

And for the ultimate power user, the new settings modal provides a custom staleTime option. This allows any user to override the adaptive tiers and set their own cache duration in minutes, offering the highest level of control over the application's behavior.
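Putting the tiers together, the selection logic reduces to a small function. The thresholds mirror the prose above; the real constant names live in the project's constants module, and representing the manual-only tier as `Infinity` is an assumption of this sketch:

```typescript
const MINUTE = 60 * 1000;
const STALE_TIME_SMALL = 15 * MINUTE;        // up to 2,000 connections
const STALE_TIME_MEDIUM = 3 * 60 * MINUTE;   // up to 10,000
const STALE_TIME_LARGE = 12 * 60 * MINUTE;   // up to 50,000
const STALE_TIME_MANUAL_ONLY = Number.POSITIVE_INFINITY; // 50,001+

// A custom staleTime (in minutes, from the settings modal) overrides
// the adaptive tiers entirely.
function getStaleTime(totalConnections: number, customStaleTimeMinutes?: number): number {
  if (customStaleTimeMinutes) return customStaleTimeMinutes * MINUTE;
  if (totalConnections <= 2_000) return STALE_TIME_SMALL;
  if (totalConnections <= 10_000) return STALE_TIME_MEDIUM;
  if (totalConnections <= 50_000) return STALE_TIME_LARGE;
  return STALE_TIME_MANUAL_ONLY; // refresh only on explicit user action
}
```

With `Infinity` as the manual-only value, a simple `Date.now() - timestamp > staleTime` check never marks the extreme tier as stale, so no automatic refresh ever fires.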

Challenge 2: Detecting 'Ghost' Accounts via the Edge

While testing, I stumbled upon "ghost" accounts: deleted users who still appear in follower lists and lead to a 404 page. The GitHub API provides no way to detect them. The key insight was two-fold: ghosts primarily matter when they are non-mutuals, and they almost always have 0 followers and 0 following.

The final solution is a multi-step process:

  • On the client, the app filters the list of non-mutuals for any user with zero followers and zero following.
  • This much smaller list of "potential ghosts" is sent in small batches to a Vercel Edge Function.
  • The edge function performs a lightweight HEAD request for each username and returns a list of those that resulted in a 404, confirming them as ghosts.
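The client-side half of that process is a pure pre-filter plus batching. A sketch, with the edge-side HEAD check shown as a comment (the field names follow GitHub's GraphQL user object; batch size and endpoint are illustrative):

```typescript
// Minimal slice of GitHub's GraphQL user object needed for the filter.
interface NetworkUser {
  login: string;
  followers: { totalCount: number };
  following: { totalCount: number };
}

// Step 1: only zero/zero non-mutuals are plausible ghosts.
function filterPotentialGhosts(nonMutuals: NetworkUser[]): string[] {
  return nonMutuals
    .filter((u) => u.followers.totalCount === 0 && u.following.totalCount === 0)
    .map((u) => u.login);
}

// Step 2: split candidates into small batches for the edge function.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Step 3 (on the edge): one lightweight HEAD request per candidate.
// const res = await fetch(`https://github.com/${login}`, { method: 'HEAD' });
// if (res.status === 404) confirmed.push(login); // deleted account
```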

Challenge 3: Building a Responsive UI with Optimistic Updates

A core goal was to eliminate friction by making actions instant and in-app. When a user follows or unfollows someone, I use an optimistic update. The UI reflects the change immediately, while the API call happens in the background. If it fails, React Query seamlessly reverts the UI and displays an error. I extended this pattern to bulk operations with a custom useBulkOperation hook, which iterates through a list of users, reports progress in real-time, and adds a small delay between requests to respect API limits.
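Stripped of the React Query wiring, the optimistic update is two pure transforms on the cached network plus a snapshot for rollback; `onMutate` applies the transform and `onError` restores the snapshot. Names here are illustrative:

```typescript
interface User { id: string; login: string; }
interface Network { followers: User[]; following: User[]; }

// Optimistically add the user to the following list; the original
// network object is left untouched so it can serve as the rollback snapshot.
function applyFollow(network: Network, user: User): Network {
  return { ...network, following: [...network.following, user] };
}

function applyUnfollow(network: Network, user: User): Network {
  return {
    ...network,
    following: network.following.filter((u) => u.id !== user.id),
  };
}

// onMutate: const snapshot = network; setNetwork(applyFollow(network, user));
// onError:  setNetwork(snapshot); // revert the UI if the API call fails
```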

Challenge 4: Empowering Users with Custom Settings

From the outset, a core design principle for FollowSync was to empower "power users." While the adaptive caching system worked well for most, I wanted to provide more granular control, fully realizing the initial vision of a highly customizable tool. This led to the implementation of a user settings feature.

The primary goal was to allow users to override the default application behavior to better suit their specific needs. This included customizing the cache staleTime, adjusting the batch size for ghost detection, and tweaking UI elements like avatar visibility and pagination size.

The solution was a settings modal, managed by its own dedicated Zustand store, useSettingsStore. This approach kept the settings logic cleanly separated from other application state. The settings themselves are saved directly into the user's Gist cache as a distinct settings property, ensuring they persist across sessions. This implementation not only added the desired customizability but also reinforced the "GitHub-as-Infrastructure" philosophy by storing user preferences alongside their network data.
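As a sketch of the persisted settings slice: the default values below are assumptions, but merging whatever the Gist contains over the defaults means a cache written by an older version of the app never breaks when a new setting is introduced.

```typescript
// Settings persisted inside the Gist cache's `settings` property.
interface UserSettings {
  showAvatars: boolean;
  ghostDetectionBatchSize: number;
  paginationPageSize: number;
  customStaleTime: number | null; // minutes; null = use the adaptive tiers
}

// Illustrative defaults; the real store wraps these with Zustand setters.
const DEFAULT_SETTINGS: UserSettings = {
  showAvatars: true,
  ghostDetectionBatchSize: 10,
  paginationPageSize: 25,
  customStaleTime: null,
};

// Hydrate from the Gist, falling back to defaults for any missing keys.
function hydrateSettings(fromGist?: Partial<UserSettings>): UserSettings {
  return { ...DEFAULT_SETTINGS, ...fromGist };
}
```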

Code Spotlight: From Monolith to Modular State Management

As FollowSync grew, the initial state management solution, a single useCacheStore, began to show signs of strain. It had become a classic "god store": a monolithic entity handling network data, ghost detection state, Gist interactions, and caching logic. This tight coupling made the store difficult to maintain, test, and extend. It was functional, but it didn't meet my standards for a clean, scalable architecture.

The solution was a significant refactor guided by the principle of separation of concerns. I broke down the monolithic useCacheStore into several smaller, more focused stores:

  • useNetworkStore: Manages only the follower and following data.
  • useGhostStore: Handles the state related to ghost account detection.
  • useGistStore: Manages interactions and metadata related to the Gist cache.
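To show the shape of one focused slice without the Zustand machinery, here is a hand-rolled equivalent of the network store; the real stores use Zustand's `create()`, which adds the React bindings on top of exactly this getState/setter pattern:

```typescript
interface User { login: string; }

// The only state this slice owns: follower and following data.
interface NetworkState {
  followers: User[];
  following: User[];
}

// A minimal store: private state, a reader, and one focused setter.
function createNetworkStore() {
  let state: NetworkState = { followers: [], following: [] };
  return {
    getState: () => state,
    setNetwork: (network: NetworkState) => {
      state = network;
    },
  };
}
```

Keeping each slice this narrow is what makes the stores independently testable; cross-slice logic lives in the orchestrator, not in the stores themselves.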

To orchestrate these now-independent stores, I introduced a centralized custom hook, useCacheManager. This orchestrator is responsible for the application's core logic: initializing data, fetching the network, handling the cache, and persisting changes. It acts as the "brain," calling on the specialized stores to manage their respective slices of state.

This refactor did introduce new challenges, particularly a race condition where ghost verification could trigger before the full network data was fetched. I resolved this by structuring the orchestrator's functions as callbacks, ensuring a sequential and predictable flow of operations. The result is a far more robust, modular, and maintainable state management system that is well-positioned for future development.

Lessons Learned and The Road Ahead

Building FollowSync was an exercise in pragmatic engineering and a rewarding way to jump back into development.

  • Client-Heavy is a Viable Strategy: Offloading long-running tasks to the client was the right call, allowing me to work within the constraints of a free, serverless platform without sacrificing core features.
  • Leverage the Platform You're On: Using Gists as a database was a reminder that creative solutions are often hiding in plain sight. It simplified the architecture, eliminated costs, and improved user data ownership.
  • Design for the Power User: Anticipating the needs of users with massive networks from the start forced me to build more robust and scalable solutions, which ultimately benefits everyone.
  • Refactor, Don't Hesitate: The move from a monolithic store to a modular, orchestrated state management system was a significant improvement. It underscored the importance of refactoring proactively when a solution no longer aligns with the project's long-term goals, even if it's currently "working."

With the core features and user settings now complete, the project has reached a new level of maturity. The road ahead is focused on continued maintenance, monitoring for any breaking changes in the GitHub API, and listening to user feedback for any future enhancements.

Share Project


useCacheManager.ts

import { GraphQLClient } from 'graphql-request';
import { toast } from 'sonner';
import { useCallback } from 'react';

import { useNetworkStore } from '@/lib/store/network';
import { useGistStore } from '@/lib/store/gist';
import { useGhostStore } from '@/lib/store/ghost';
import { useSettingsStore } from '@/lib/store/settings';

import { findCacheGist, parseCache, writeCache } from '@/lib/gist';
import { fetchAllUserFollowersAndFollowing } from '@/lib/gql/fetchers';
import {
  GIST_ID_STORAGE_KEY,
  STALE_TIME_LARGE,
  STALE_TIME_MANUAL_ONLY,
  STALE_TIME_MEDIUM,
  STALE_TIME_SMALL,
} from '@/lib/constants';
import { CachedData, ProgressCallbacks } from '@/lib/types';
import { UserInfoFragment } from '@/lib/gql/types';

export const useCacheManager = () => {
  const setNetwork = useNetworkStore((state) => state.setNetwork);
  const { setGhosts } = useGhostStore();
  const { setGistName, setGistData } = useGistStore();
  const settings = useSettingsStore();

  const loadFromCache = useCallback(
    (cachedData: CachedData) => {
      setNetwork(cachedData.network);
      setGhosts(cachedData.ghosts);
      setGistData({
        timestamp: cachedData.timestamp,
        metadata: cachedData.metadata,
      });

      if (cachedData.settings) {
        settings.setShowAvatars(cachedData.settings.showAvatars);
        settings.setGhostDetectionBatchSize(
          cachedData.settings.ghostDetectionBatchSize
        );
        settings.setPaginationPageSize(cachedData.settings.paginationPageSize);
        settings.setCustomStaleTime(cachedData.settings.customStaleTime);
      }
    },
    [setNetwork, setGhosts, setGistData, settings]
  );

  const initializeAndFetchNetwork = useCallback(
    async (
      client: GraphQLClient,
      username: string,
      accessToken: string,
      progress: ProgressCallbacks
    ) => {
      const { show, update, complete, fail } = progress;
      const localGistName = window.localStorage.getItem(GIST_ID_STORAGE_KEY);
      setGistName(localGistName);

      const isForced = useGistStore.getState().forceNextRefresh;
      const currentGistName = useGistStore.getState().gistName;

      if (isForced) {
        useGistStore.getState().setForceNextRefresh(false);
      }

      if (!isForced) {
        const foundGist = await findCacheGist(client, currentGistName);
        if (foundGist) {
          const cachedData = parseCache(foundGist);
          if (cachedData) {
            setGistName(foundGist.name);
            const totalConnections = cachedData.metadata.totalConnections;
            const { customStaleTime } = settings;

            let staleTime = 0;
            if (customStaleTime) {
              staleTime = customStaleTime * 60 * 1000;
            } else if (totalConnections <= 2000) {
              staleTime = STALE_TIME_SMALL;
            } else if (totalConnections <= 10000) {
              staleTime = STALE_TIME_MEDIUM;
            } else if (totalConnections <= 50000) {
              staleTime = STALE_TIME_LARGE;
            } else {
              staleTime = STALE_TIME_MANUAL_ONLY;
            }

            const isStale = Date.now() - cachedData.timestamp > staleTime;
            loadFromCache(cachedData);

            if (!isStale) {
              toast.info('Loaded fresh data from cache.');
              return cachedData.network;
            }

            if (staleTime === STALE_TIME_MANUAL_ONLY) {
              toast.info(
                'Data loaded from cache. Refresh manually for the latest update.'
              );
              return cachedData.network;
            }
          }
        }
      }

      const fetchStart = performance.now();
      show({
        title: 'Syncing Your Network',
        message: 'Fetching connections from GitHub...',
        items: [
          { label: 'Followers', current: 0, total: 0 },
          { label: 'Following', current: 0, total: 0 },
        ],
      });

      try {
        const networkData = await fetchAllUserFollowersAndFollowing({
          client,
          username,
          onProgress: (p) => {
            update([
              {
                label: 'Followers',
                current: p.fetchedFollowers,
                total: p.totalFollowers,
              },
              {
                label: 'Following',
                current: p.fetchedFollowing,
                total: p.totalFollowing,
              },
            ]);
          },
        });

        const fetchEnd = performance.now();
        const fetchDuration = Math.round((fetchEnd - fetchStart) / 1000);
        const followers = networkData.followers.nodes as UserInfoFragment[];
        const following = networkData.following.nodes as UserInfoFragment[];
        const network = { followers, following };
        const timestamp = Date.now();

        const dataToCache: CachedData = {
          network,
          ghosts: [],
          settings,
          timestamp,
          metadata: {
            totalConnections: followers.length + following.length,
            fetchDuration,
            cacheVersion: '1.0',
          },
        };

        const newGist = await writeCache(
          accessToken,
          dataToCache,
          currentGistName
        );

        setNetwork(network);
        setGistData({ timestamp, metadata: dataToCache.metadata });
        setGistName(newGist.id);
        complete();

        return network;
      } catch (err) {
        const message =
          err instanceof Error && err.message
            ? err.message
            : 'Failed to sync network.';
        fail({ message });
        throw err;
      }
    },
    [settings, loadFromCache, setGistName, setNetwork, setGistData]
  );

  // other functions

  return {
    initializeAndFetchNetwork,
    loadFromCache,
    // other functions
  };
};
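
The tiered stale-time branch in `initializeAndFetchNetwork` can be factored into a pure helper, which makes the thresholds easy to unit-test in isolation. This is only a sketch: `resolveStaleTime` is a hypothetical name, and the constant values shown here are illustrative stand-ins for the real ones in `@/lib/constants`.

```typescript
// Illustrative values only; the real constants live in '@/lib/constants'.
const STALE_TIME_SMALL = 30 * 60 * 1000; // 30 min for small networks
const STALE_TIME_MEDIUM = 2 * 60 * 60 * 1000; // 2 h
const STALE_TIME_LARGE = 12 * 60 * 60 * 1000; // 12 h
const STALE_TIME_MANUAL_ONLY = Number.MAX_SAFE_INTEGER; // never auto-refresh

// Hypothetical pure helper mirroring the branch logic above: a user-supplied
// customStaleTime (in minutes) always wins; otherwise network size picks the tier.
function resolveStaleTime(
  totalConnections: number,
  customStaleTime: number | null
): number {
  if (customStaleTime) return customStaleTime * 60 * 1000;
  if (totalConnections <= 2000) return STALE_TIME_SMALL;
  if (totalConnections <= 10000) return STALE_TIME_MEDIUM;
  if (totalConnections <= 50000) return STALE_TIME_LARGE;
  return STALE_TIME_MANUAL_ONLY;
}
```

A pure function like this keeps the caching policy testable without mocking stores or network calls.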

useFollowManager.ts

import { useSession } from 'next-auth/react';
import { useMutation } from '@tanstack/react-query';
import { toast } from 'sonner';
import { useNetworkStore } from '@/lib/store/network';
import { followUser, unfollowUser } from '@/lib/gql/fetchers';
import { useClientAuthenticatedGraphQLClient } from '@/lib/gql/client';
import { UserInfoFragment } from '@/lib/gql/types';
import { useModalsStore } from '@/lib/store/modals';
import { useCacheManager } from './useCacheManager';

export const useFollowManager = () => {
  const { client } = useClientAuthenticatedGraphQLClient();
  const { data: session } = useSession();
  const { network, setNetwork } = useNetworkStore();
  const { persistChanges } = useCacheManager();
  const { incrementActionCount } = useModalsStore();

  const followMutation = useMutation({
    mutationFn: (userToFollow: UserInfoFragment) => {
      if (!client) throw new Error('GraphQL client not available');
      return followUser({ client, userId: userToFollow.id });
    },
    onMutate: async (userToFollow: UserInfoFragment) => {
      const previousNetwork = network;
      const newFollowing = [...network.following, userToFollow];
      const newNetwork = { ...network, following: newFollowing };
      setNetwork(newNetwork);
      return { previousNetwork };
    },
    onError: (err, userToFollow, context) => {
      if (context?.previousNetwork) {
        setNetwork(context.previousNetwork);
      }
      toast.error(`Failed to follow @${userToFollow.login}: ${err.message}`);
    },
    onSettled: () => {
      if (session?.accessToken) {
        persistChanges();
      }
    },
  });

  const unfollowMutation = useMutation({
    mutationFn: (userToUnfollow: UserInfoFragment) => {
      if (!client) throw new Error('GraphQL client not available');
      return unfollowUser({ client, userId: userToUnfollow.id });
    },
    onMutate: async (userToUnfollow: UserInfoFragment) => {
      const previousNetwork = network;
      const newFollowing = network.following.filter(
        (u) => u.id !== userToUnfollow.id
      );
      const newNetwork = { ...network, following: newFollowing };
      setNetwork(newNetwork);
      return { previousNetwork };
    },
    onError: (err, userToUnfollow, context) => {
      if (context?.previousNetwork) {
        setNetwork(context.previousNetwork);
      }
      toast.error(
        `Failed to unfollow @${userToUnfollow.login}: ${err.message}`
      );
    },
    onSettled: () => {
      if (session?.accessToken) {
        persistChanges();
      }
    },
  });

  return {
    followMutation,
    unfollowMutation,
    incrementActionCount,
  };
};
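
The snapshot-and-rollback pattern in `onMutate`/`onError` boils down to plain immutable data transforms. A minimal framework-free sketch (the `optimisticFollow` helper and the `User`/`Network` shapes are illustrative, not part of the project):

```typescript
type User = { id: string; login: string };
type Network = { followers: User[]; following: User[] };

// Hypothetical helper mirroring the mutation hooks above: capture the
// previous network, apply the optimistic change immutably, and hand back
// a rollback that restores the snapshot if the server call fails.
function optimisticFollow(network: Network, user: User) {
  const previousNetwork = network;
  const next = { ...network, following: [...network.following, user] };
  return { next, rollback: () => previousNetwork };
}

const before: Network = { followers: [], following: [] };
const { next, rollback } = optimisticFollow(before, {
  id: '1',
  login: 'octocat',
});
```

Because the update never mutates the original object, rolling back is just restoring the old reference, which is exactly what `setNetwork(context.previousNetwork)` does in `onError`.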


useGhostDetector.ts

import { useEffect } from 'react';
import { useSession } from 'next-auth/react';
import { useGhostStore } from '@/lib/store/ghost';
import { useNetworkStore } from '@/lib/store/network';
import { useSettingsStore } from '@/lib/store/settings';
import { useCacheManager } from './useCacheManager';

const DELAY_BETWEEN_BATCHES = 1000; // 1 second

interface GhostDetectorProps {
  isNetworkReady: boolean;
}

export const useGhostDetector = ({ isNetworkReady }: GhostDetectorProps) => {
  const { data: session } = useSession();
  const accessToken = session?.accessToken;
  const { nonMutuals } = useNetworkStore();
  const { nonMutualsFollowingYou, nonMutualsYouFollow } = nonMutuals;
  const { ghosts, setIsCheckingGhosts } = useGhostStore();
  const { ghostDetectionBatchSize } = useSettingsStore();
  const { updateGhosts } = useCacheManager();

  useEffect(() => {
    const detectGhosts = async () => {
      if (!isNetworkReady || !accessToken) {
        setIsCheckingGhosts(false);
        return;
      }

      setIsCheckingGhosts(true);

      const potentialGhosts = [
        ...nonMutualsYouFollow,
        ...nonMutualsFollowingYou,
      ].filter(
        (user) =>
          user?.followers.totalCount === 0 && user?.following.totalCount === 0
      );

      const newPotentialGhosts = potentialGhosts.filter(
        (potentialGhost) =>
          !ghosts.some(
            (existingGhost) => existingGhost.login === potentialGhost.login
          )
      );

      if (newPotentialGhosts.length === 0) {
        setIsCheckingGhosts(false);
        return;
      }

      const confirmedGhosts: typeof newPotentialGhosts = [];

      for (
        let i = 0;
        i < newPotentialGhosts.length;
        i += ghostDetectionBatchSize
      ) {
        const batch = newPotentialGhosts.slice(i, i + ghostDetectionBatchSize);
        const usernames = batch.map((user) => user?.login);

        try {
          const response = await fetch('/api/verify-ghosts', {
            method: 'POST',
            headers: {
              'Content-Type': 'application/json',
            },
            body: JSON.stringify({ usernames }),
          });

          if (response.ok) {
            const { ghosts: ghostUsernames } = await response.json();
            const batchGhosts = batch.filter((user) =>
              ghostUsernames.includes(user?.login)
            );
            confirmedGhosts.push(...batchGhosts);
          }
        } catch (error) {
          console.error('Error verifying ghost batch:', error);
        }

        if (i + ghostDetectionBatchSize < newPotentialGhosts.length) {
          await new Promise((resolve) =>
            setTimeout(resolve, DELAY_BETWEEN_BATCHES)
          );
        }
      }

      if (confirmedGhosts.length > 0) {
        await updateGhosts(confirmedGhosts);
      }

      setIsCheckingGhosts(false);
    };

    detectGhosts();
  }, [
    isNetworkReady,
    nonMutualsFollowingYou,
    nonMutualsYouFollow,
    accessToken,
    ghosts,
    setIsCheckingGhosts,
    ghostDetectionBatchSize,
    updateGhosts,
  ]);
};
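
The detection loop above is essentially "chunk, verify, pause". The slicing step can be expressed as a small generic helper; `chunk` is a hypothetical utility shown for illustration, not a function from the codebase:

```typescript
// Hypothetical generic helper: split an array into fixed-size batches,
// matching the slice(i, i + ghostDetectionBatchSize) loop above.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

const batches = chunk(['a', 'b', 'c', 'd', 'e'], 2);
// Each batch would then be POSTed to /api/verify-ghosts, with a
// DELAY_BETWEEN_BATCHES pause between requests to stay under rate limits.
```

Keeping the batch size user-configurable (via `ghostDetectionBatchSize`) lets users with large networks trade speed against API pressure.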

settings.ts

import { create } from 'zustand';
import { PAGE_SIZE_LIST } from '../constants';

export type SettingsState = {
  isSettingsModalOpen: boolean;
  showAvatars: boolean;
  ghostDetectionBatchSize: number;
  paginationPageSize: number;
  customStaleTime: number | null;
};

export type SettingsActions = {
  toggleSettingsModal: () => void;
  setShowAvatars: (show: boolean) => void;
  setGhostDetectionBatchSize: (size: number) => void;
  setPaginationPageSize: (size: number) => void;
  setCustomStaleTime: (time: number | null) => void;
  saveSettings: (
    accessToken: string,
    persistChanges: (accessToken: string) => Promise<void>
  ) => Promise<void>;
};

export type SettingsStore = SettingsState & SettingsActions;

export const useSettingsStore = create<SettingsStore>((set) => ({
  isSettingsModalOpen: false,
  showAvatars: true,
  ghostDetectionBatchSize: 10,
  paginationPageSize: PAGE_SIZE_LIST[0],
  customStaleTime: null,
  toggleSettingsModal: () =>
    set((state) => ({ isSettingsModalOpen: !state.isSettingsModalOpen })),
  setShowAvatars: (show) => set({ showAvatars: show }),
  setGhostDetectionBatchSize: (size) => set({ ghostDetectionBatchSize: size }),
  setPaginationPageSize: (size) => set({ paginationPageSize: size }),
  setCustomStaleTime: (time) => set({ customStaleTime: time }),
  saveSettings: async (accessToken, persistChanges) => {
    await persistChanges(accessToken);
  },
}));

useCacheManager.ts

import { GraphQLClient } from 'graphql-request';
import { toast } from 'sonner';
import { useCallback } from 'react';

import { useNetworkStore } from '@/lib/store/network';
import { useGistStore } from '@/lib/store/gist';
import { useGhostStore } from '@/lib/store/ghost';
import { useSettingsStore } from '@/lib/store/settings';

import { findCacheGist, parseCache, writeCache } from '@/lib/gist';
import { fetchAllUserFollowersAndFollowing } from '@/lib/gql/fetchers';
import {
  GIST_ID_STORAGE_KEY,
  STALE_TIME_LARGE,
  STALE_TIME_MANUAL_ONLY,
  STALE_TIME_MEDIUM,
  STALE_TIME_SMALL,
} from '@/lib/constants';
import { CachedData, ProgressCallbacks } from '@/lib/types';
import { UserInfoFragment } from '@/lib/gql/types';
import { useSession } from 'next-auth/react';

export const useCacheManager = () => {
  const setNetwork = useNetworkStore((state) => state.setNetwork);
  const { setGhosts, addGhosts } = useGhostStore();
  const { setGistName, setGistData, setTimestamp } = useGistStore();
  const settings = useSettingsStore();

  const { data } = useSession();
  const accessToken = data?.accessToken;

  const loadFromCache = useCallback(
    (cachedData: CachedData) => {
      setNetwork(cachedData.network);
      setGhosts(cachedData.ghosts);
      setGistData({
        timestamp: cachedData.timestamp,
        metadata: cachedData.metadata,
      });

      if (cachedData.settings) {
        settings.setShowAvatars(cachedData.settings.showAvatars);
        settings.setGhostDetectionBatchSize(
          cachedData.settings.ghostDetectionBatchSize
        );
        settings.setPaginationPageSize(cachedData.settings.paginationPageSize);
        settings.setCustomStaleTime(cachedData.settings.customStaleTime);
      }
    },
    [setNetwork, setGhosts, setGistData, settings]
  );

  const initializeAndFetchNetwork = useCallback(
    async (
      client: GraphQLClient,
      username: string,
      accessToken: string,
      progress: ProgressCallbacks
    ) => {
      const { show, update, complete, fail } = progress;
      const localGistName = window.localStorage.getItem(GIST_ID_STORAGE_KEY);
      setGistName(localGistName);

      const isForced = useGistStore.getState().forceNextRefresh;
      const currentGistName = useGistStore.getState().gistName;

      if (isForced) {
        useGistStore.getState().setForceNextRefresh(false);
      }

      if (!isForced) {
        const foundGist = await findCacheGist(client, currentGistName);
        if (foundGist) {
          const cachedData = parseCache(foundGist);
          if (cachedData) {
            setGistName(foundGist.name);
            const totalConnections = cachedData.metadata.totalConnections;
            const { customStaleTime } = settings;

            let staleTime = 0;
            if (customStaleTime) {
              staleTime = customStaleTime * 60 * 1000;
            } else if (totalConnections <= 2000) {
              staleTime = STALE_TIME_SMALL;
            } else if (totalConnections <= 10000) {
              staleTime = STALE_TIME_MEDIUM;
            } else if (totalConnections <= 50000) {
              staleTime = STALE_TIME_LARGE;
            } else {
              staleTime = STALE_TIME_MANUAL_ONLY;
            }

            const isStale = Date.now() - cachedData.timestamp > staleTime;
            loadFromCache(cachedData);

            if (!isStale) {
              toast.info('Loaded fresh data from cache.');
              return cachedData.network;
            }

            if (staleTime === STALE_TIME_MANUAL_ONLY) {
              toast.info(
                'Data loaded from cache. Refresh manually for the latest update.'
              );
              return cachedData.network;
            }
          }
        }
      }

      const fetchStart = performance.now();
      show({
        title: 'Syncing Your Network',
        message: 'Fetching connections from GitHub...',
        items: [
          { label: 'Followers', current: 0, total: 0 },
          { label: 'Following', current: 0, total: 0 },
        ],
      });

      try {
        const networkData = await fetchAllUserFollowersAndFollowing({
          client,
          username,
          onProgress: (p) => {
            update([
              {
                label: 'Followers',
                current: p.fetchedFollowers,
                total: p.totalFollowers,
              },
              {
                label: 'Following',
                current: p.fetchedFollowing,
                total: p.totalFollowing,
              },
            ]);
          },
        });

        const fetchEnd = performance.now();
        const fetchDuration = Math.round((fetchEnd - fetchStart) / 1000);
        const followers = networkData.followers.nodes as UserInfoFragment[];
        const following = networkData.following.nodes as UserInfoFragment[];
        const network = { followers, following };
        const timestamp = Date.now();

        const dataToCache: CachedData = {
          network,
          ghosts: [],
          settings,
          timestamp,
          metadata: {
            totalConnections: followers.length + following.length,
            fetchDuration,
            cacheVersion: '1.0',
          },
        };

        const newGist = await writeCache(
          accessToken,
          dataToCache,
          currentGistName
        );

        setNetwork(network);
        setGistData({ timestamp, metadata: dataToCache.metadata });
        setGistName(newGist.id);
        complete();

        return network;
      } catch (err) {
        const message =
          err instanceof Error && err.message
            ? err.message
            : 'Failed to sync network.';
        fail({ message });
        throw err;
      }
    },
    [settings, loadFromCache, setGistName, setNetwork, setGistData]
  );

  const persistChanges = useCallback(async () => {
    if (!accessToken) return;

    const { network } = useNetworkStore.getState();
    const { ghosts } = useGhostStore.getState();
    const { metadata, gistName } = useGistStore.getState();
    const currentSettings = useSettingsStore.getState();

    if (!network || !metadata) return;
    const newTimestamp = Date.now();
    setTimestamp(newTimestamp);

    const dataToCache: CachedData = {
      network,
      ghosts,
      settings: currentSettings,
      timestamp: newTimestamp,
      metadata,
    };

    await writeCache(accessToken, dataToCache, gistName);
  }, [accessToken, setTimestamp]);

  const updateGhosts = useCallback(
    async (newGhosts: UserInfoFragment[]) => {
      addGhosts(newGhosts);
      await persistChanges();
    },
    [addGhosts, persistChanges]
  );

  return {
    initializeAndFetchNetwork,
    loadFromCache,
    persistChanges,
    updateGhosts,
  };
};
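
Note that `persistChanges` reads each store via `getState()` at call time instead of closing over hook values. The difference matters inside memoized callbacks, as this framework-free sketch shows (the `createStore` mini-store is an illustration loosely imitating the pattern, not zustand itself):

```typescript
// Minimal store sketch: a mutable holder with a getState() accessor.
function createStore<T>(initial: T) {
  let state = initial;
  return {
    getState: () => state,
    setState: (next: T) => {
      state = next;
    },
  };
}

const store = createStore({ count: 0 });

// Closure captured once: sees only the value at creation time.
const captured = store.getState();
const staleReader = () => captured.count;

// getState-based reader: always sees the latest value, which is why
// persistChanges can stay referentially stable yet still write fresh data.
const freshReader = () => store.getState().count;

store.setState({ count: 5 });
```

Reading at call time keeps the `useCallback` dependency list short while guaranteeing the gist is written with current state, not a stale snapshot.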