test: deflake and another next 16 compat update for fixtures #3181
Changes from all commits: ae33942, 152decc, 76bccca, 0e951a2, e47ff48
```diff
@@ -1,4 +1,4 @@
-import { unstable_cacheLife as cacheLife, unstable_cacheTag as cacheTag } from 'next/cache'
+import { cacheLife, cacheTag } from '../../../../../next-cache'
 import {
   BasePageComponentProps,
   getDataImplementation,
```

> But because we will still be running tests for next@15, I added a helper module for compatibility that will use either export.
```diff
@@ -0,0 +1,20 @@
+import * as NextCacheTyped from 'next/cache'
+
+const NextCache = NextCacheTyped as any
+
+export const cacheLife: any =
+  'cacheLife' in NextCache
+    ? NextCache.cacheLife
+    : 'unstable_cacheLife' in NextCache
+      ? NextCache.unstable_cacheLife
+      : () => {
+          throw new Error('both unstable_cacheLife and cacheLife are missing from next/cache')
+        }
+
+export const cacheTag: any =
+  'cacheTag' in NextCache
+    ? NextCache.cacheTag
+    : 'unstable_cacheTag' in NextCache
+      ? NextCache.unstable_cacheTag
+      : () => {
+          throw new Error('both unstable_cacheTag and cacheTag are missing from next/cache')
+        }
```
```diff
@@ -542,12 +542,18 @@ export const fixtureFactories = {
       packagePath: 'apps/next-app',
       buildCommand: 'nx run next-app:build',
       publishDirectory: 'dist/apps/next-app/.next',
+      env: {
+        NX_ISOLATE_PLUGINS: 'false',
+      },
     }),
   nxIntegratedDistDir: () =>
     createE2EFixture('nx-integrated', {
       packagePath: 'apps/custom-dist-dir',
       buildCommand: 'nx run custom-dist-dir:build',
       publishDirectory: 'dist/apps/custom-dist-dir/dist',
+      env: {
+        NX_ISOLATE_PLUGINS: 'false',
+      },
     }),
   cliBeforeRegionalBlobsSupport: () =>
     createE2EFixture('cli-before-regional-blobs-support', {
```

Comment on lines +545 to +556:

> nx on next canary started failing with errors mentioned in nrwl/nx#27040 (but it's not exactly the same, because the issue mentions […])
```diff
@@ -0,0 +1,41 @@
+// We are seeing quite a bit of 'fetch failed' cases in GitHub Actions that don't really reproduce
+// locally. We are likely hitting some limits there when attempting to parallelize. They are not
+// consistent, so instead of reducing parallelism, we add a retry with backoff here.
+
+const originalFetch = globalThis.fetch
+
+const NUM_RETRIES = 5
+
+globalThis.fetch = async (...args) => {
+  let backoff = 100
+  for (let attempt = 1; attempt <= NUM_RETRIES; attempt++) {
+    try {
+      return await originalFetch.apply(globalThis, args)
+    } catch (error) {
+      let shouldRetry = false
+      // not ideal, but there is no error code for that
+      if (error.message === 'fetch failed' && attempt < NUM_RETRIES) {
+        // on this error we try again
+        shouldRetry = true
+      }
+
+      if (shouldRetry) {
+        // leave some trace in logs of what's happening
+        console.error('[fetch-retry] fetch threw, retrying...', {
+          args,
+          attempt,
+          errorMsg: error.message,
+        })
+
+        const currentBackoff = backoff
+        await new Promise((resolve) => {
+          setTimeout(resolve, currentBackoff)
+        })
+        backoff *= 2
+        continue
+      }
+
+      throw error
+    }
+  }
+}
```

> This is happening very randomly and does not seem related to any specific fetch target; it seems more related to runners and hitting potential limits there. PS. I do hate it, but have no better idea, as this would have to be replicated in a lot of places, some of which we don't control.
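The wrapper above patches the global `fetch` in one place; the same retry-with-backoff pattern can be sketched as a standalone helper for any async operation. This is a hypothetical generalization, not part of the PR; `withRetry` and its parameters are illustrative names:

```typescript
// Retry an async operation with exponential backoff.
async function withRetry<T>(
  op: () => Promise<T>,
  retries = 5,
  initialBackoffMs = 100,
): Promise<T> {
  let backoff = initialBackoffMs
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await op()
    } catch (error) {
      // give up once the retry budget is exhausted
      if (attempt === retries) throw error
      // wait, then double the delay: 100ms, 200ms, 400ms, ...
      await new Promise((resolve) => setTimeout(resolve, backoff))
      backoff *= 2
    }
  }
  // unreachable: the loop either returns or rethrows on the last attempt
  throw new Error('unreachable')
}

// Example: an operation that fails twice before succeeding on attempt 3.
let calls = 0
withRetry(async () => {
  calls += 1
  if (calls < 3) throw new Error('fetch failed')
  return 'ok'
}).then((result) => console.log(result, calls)) // → ok 3
```

The PR version inlines this directly into the patched `fetch` because the wrapper must be installed before any test code captures a reference to the global.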
> The Netlify Edge one was wrong for quite some time, so in practice this was relying on hitting the durable cache case.