forked from datacommonsorg/website
merge from master #1
Open
lucy-kind wants to merge 2,946 commits into lucy-kind:addsvs from datacommonsorg:master
Conversation
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
…adata, Add BigQuery Diff mode and other things for productionization (#5490)

Major refactoring of the script that generates NL Metadata for Vertex AI:
- Adds logging throughout the script to give better visibility while execution is in progress.
- Adds a BigQuery Diff run mode, as discussed previously, which diffs the contents of the GCS bucket against the latest contents of BigQuery so that ONLY new stat vars are processed.
- Adds a Compaction run mode. Because the BigQuery Diff run mode keeps adding new files on each run, a compaction script will run monthly to merge all the files and avoid bloat.
- Improves the failure-retry mode so it can easily run on the periodic/ folder.
- Refactors helper functions and Gemini-related functions into helper files.
- Updates the README and comments.
- Improves tests.

Tested: Ran this script many times in BigQuery Diff run mode with a max stat var count to generate many files in a testing directory, then ran the compaction script. Also tested in the Datcom-nl project, see [Cloud Run Jobs](https://pantheon.corp.google.com/run/jobs/details/us-central1/stat-var-metadata-generator/executions?e=13803378&inv=1&invt=Ab1zPA&mods=-monitoring_api_staging&project=datcom-nl) and [output file](https://pantheon.corp.google.com/storage/browser/metadata_for_vertexai_search/gmechali_csv_testing/periodic;tab=objects?e=13803378&inv=1&invt=Ab1m6g&mods=-monitoring_api_staging&project=datcom-ci&prefix=&forceOnObjectsSortingFiltering=false). To see it in action, see the GCP [Workflow](https://pantheon.corp.google.com/workflows/workflow/us-central1/testing-workflow/executions?e=13803378&inv=1&invt=Ab1zPA&mods=-monitoring_api_staging&project=datcom-nl), which reprocesses all the stat vars in BigQuery Diff mode, then runs the compaction mode.
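A minimal sketch of the BigQuery-diff idea described above, assuming a hypothetical table name, object prefix, and one-file-per-DCID naming convention; the actual script's configuration and structure may differ.

```python
# Sketch only: the table, prefix, and file-naming convention below are assumptions,
# not the script's actual configuration.
from google.cloud import bigquery, storage

BUCKET = "metadata_for_vertexai_search"   # bucket from the PR links; layout assumed
PREFIX = "periodic/"                      # hypothetical
SV_TABLE = "project.dataset.stat_vars"    # hypothetical

def find_new_stat_vars() -> set[str]:
    """Return stat var DCIDs present in BigQuery but not yet written to GCS."""
    # All stat vars currently in BigQuery.
    bq = bigquery.Client()
    rows = bq.query(f"SELECT dcid FROM `{SV_TABLE}`").result()
    in_bigquery = {row["dcid"] for row in rows}

    # Stat vars already processed, inferred from existing object names in the bucket.
    gcs = storage.Client()
    already_done = {
        blob.name.removeprefix(PREFIX).removesuffix(".json")
        for blob in gcs.list_blobs(BUCKET, prefix=PREFIX)
    }
    # Only the difference needs to be sent through metadata generation.
    return in_bigquery - already_done
```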
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
This PR:
(1) Adds new place selector components shared between the map, scatter, and timeline tools, replacing the custom (but similar) logic used for each tool.
(2) Hooks up the new components on the map, scatter, and timeline tools under the `standardized_vis_tool` flag, including adding support for theming.
(3) Adds tests for the `enable_feature=standardized_vis_tool` version of the tools, including tests for the new place selectors.
(4) Pulls out shared place-related constants and utils specific to the visualization tools into constants and utils files, respectively.

# Testing strategy:
The new tests also cover place selector functionality. You can also verify manually by checking out this PR's code and spinning up a local instance, then clicking through the place selectors on the following pages:
localhost:8080/tools/map
localhost:8080/tools/scatter
localhost:8080/tools/timeline

# Before and After Screenshots:
## Map
### Before
<img width="2560" height="1600" alt="datacommons org_tools_map(Nest Hub Max)" src="https://github.com/user-attachments/assets/8ab43167-f4aa-4684-a337-dac888187fea" />
<img width="250" alt="datacommons org_tools_map(Pixel 7)" src="https://github.com/user-attachments/assets/1f86b227-4415-4a79-bcb2-44a8fdc6fb0b" />
### After
With flag on:
<img width="2560" height="1600" alt="localhost_8080_tools_map(Nest Hub Max)" src="https://github.com/user-attachments/assets/af23cacd-5f8e-49db-a7d2-cebe9c614a97" />
<img width="250" alt="localhost_8080_tools_map(iPhone SE)" src="https://github.com/user-attachments/assets/77b97d97-0e8c-4552-9fce-ef86b9422c43" />
With flag off:
<img width="2560" height="1600" alt="localhost_8080_tools_map_disable_feature=standardized_vis_tool(Nest Hub Max)" src="https://github.com/user-attachments/assets/229229ff-e4da-46de-9957-10c571c937d0" />
<img width="250" alt="localhost_8080_tools_map_disable_feature=standardized_vis_tool(Pixel 7)" src="https://github.com/user-attachments/assets/1a5b6570-0711-43d7-aa8e-6755c82caeb4" />
## Scatter
### Before
<img width="2560" height="1600" alt="datacommons org_tools_scatter(Nest Hub Max)" src="https://github.com/user-attachments/assets/6527212b-bad9-4e36-941f-b025cd842ecb" />
<img width="250" alt="datacommons org_tools_scatter(Pixel 7)" src="https://github.com/user-attachments/assets/f11c3f63-b783-4bb2-977f-405e305786d0" />
### After
With flag on:
<img width="2560" height="1600" alt="localhost_8080_tools_scatter(Nest Hub Max)" src="https://github.com/user-attachments/assets/7093b455-8b14-432e-8b71-f6d97c917673" />
<img width="250" alt="localhost_8080_tools_scatter(iPhone SE)" src="https://github.com/user-attachments/assets/6dcb1ac5-8738-4a57-aab6-938ae203e299" />
With flag off:
<img width="2560" height="1600" alt="localhost_8080_tools_scatter_disable_feature=standardized_vis_tool(Nest Hub Max)" src="https://github.com/user-attachments/assets/34d1b780-e9f7-4b45-860d-1f39ce32442d" />
<img width="250" alt="localhost_8080_tools_scatter_disable_feature=standardized_vis_tool(Pixel 7) (1)" src="https://github.com/user-attachments/assets/9b7f7295-7bc9-484f-b2fb-816815623463" />
## Timeline
### Before
<img width="2560" height="1600" alt="datacommons org_tools_timeline(Nest Hub Max)" src="https://github.com/user-attachments/assets/e04a12aa-da9a-4af6-bf91-da846d6806d6" />
<img width="250" alt="datacommons org_tools_timeline(Pixel 7)" src="https://github.com/user-attachments/assets/7bc12c9b-21a2-4fa6-81f6-f43f28395fbd" />
### After
With flag on:
<img width="2560" height="1600" alt="localhost_8080_tools_timeline(Nest Hub Max)" src="https://github.com/user-attachments/assets/71ca49bb-6949-4d90-ad40-04cf73ae3903" />
<img width="250" alt="localhost_8080_tools_timeline(iPhone SE) (2)" src="https://github.com/user-attachments/assets/1bf68e72-5a0a-447e-be63-cc6e0a75d657" />
With flag off:
<img width="2560" height="1600" alt="localhost_8080_tools_timeline_disable_feature=standardized_vis_tool(Nest Hub Max)" src="https://github.com/user-attachments/assets/25f2301c-a66b-420d-af5b-24107cd9f42b" />
<img width="250" alt="localhost_8080_tools_timeline_disable_feature=standardized_vis_tool(Pixel 7)" src="https://github.com/user-attachments/assets/08024513-8f9b-4f9d-b3a7-aec59522be02" />
---------
Co-authored-by: Carolyn Au <[email protected]>
Co-authored-by: Dan Noble <[email protected]>
## Description
This PR fixes the bottom padding in the highlight result section. Previously, the subtopic section inside the highlight was rendering a bottom border (correctly) but without padding.

## Screenshots
### Before
<img width="3008" height="1964" alt="Screenshot 2025-09-17 at 3 49 56 PM" src="https://github.com/user-attachments/assets/ea67c189-404f-4e83-a611-3b6d8b17062c" />
### After
<img width="1464" height="988" alt="Screenshot 2025-09-17 at 12 24 13 PM" src="https://github.com/user-attachments/assets/bf5bfcfc-71c7-4d84-805a-a4890be59bf1" />

## Testing
This scenario can be replicated at the following link: http://localhost:8080/explore?p=country/IND&sv=Amount_EconomicActivity_GrossDomesticProduction_Nominal&unit=USDollar&imp=WorldDevelopmentIndicators&obsPer=P1Y&chartType=TIMELINE_WITH_HIGHLIGHT&origin=aim

Co-authored-by: Pablo Noel <[email protected]>
…5491) Made a few changes to increase the quality of results:
- For NGrams, only pass in the portion of the query after a LOCATION as identified by the Language Client. This does add some small latency (about 250ms by a very rough approximation), but it significantly improves the results, so it is probably worth it.
- For all suggestions, append "on earth" if there are no places in the query, making it an executable query.
- Redirect the user to /explore?p=Earth&sv=StatVar for exact StatVar selections.

https://screencast.googleplex.com/cast/NjAzNTMzNjE2ODczNDcyMHwwZmM5YjQwOS03ZA
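A rough sketch of the last two suggestion-side rules above; the function names and the place-detection input are made up for illustration, and the real autocomplete code is structured differently.

```python
# Sketch of the two rules above; `detected_places` and these function names
# are invented for illustration.
from urllib.parse import urlencode

def make_executable_query(query: str, detected_places: list[str]) -> str:
    """Append "on earth" when the query has no place, so it can run as-is."""
    return query if detected_places else f"{query} on earth"

def exact_stat_var_url(stat_var_dcid: str) -> str:
    """Redirect target for an exact StatVar selection."""
    return "/explore?" + urlencode({"p": "Earth", "sv": stat_var_dcid})

# Example usage:
print(make_executable_query("number of doctors", []))       # "number of doctors on earth"
print(exact_stat_var_url("Count_Person"))                    # "/explore?p=Earth&sv=Count_Person"
```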
[South Korea Differ](https://storage.mtls.cloud.google.com/unresolved_mcf/country/southkorea/southkorea_nl_differ.pdf) [South Korea NL document](https://docs.google.com/document/d/1IWH6al4clq4XD0rot8OL5N1HCHghe1-9quC4Jc5I7JM/edit?resourcekey=0-wn1XLysh-oseCyY1UtCdCg&tab=t.0#heading=h.60omhjbn7i2s) --------- Co-authored-by: Rohit Kumar <[email protected]> Co-authored-by: Carolyn Au <[email protected]>
## Description This PR adds a webdriver snapshot for an explore page that demonstrates a highlight result. It is a follow-up to: #5501
Updates the version number from 1.5 to 2.5 for Gemini Pro models
… Build & Deploy (#5498) This will ensure that GCP Error Reporting will fire on those logs so we can be alerted if we start receiving too many errors. This also creates the CloudBuild file to push the image to Artifact registry and deploy the Cloud Run Job to use the new latest image after the new one is available. I have also setup the Cloud Build Trigger for periodic rebuild of all binaries to run this cloud build file ensuring we keep our tools StatVar Metadata Generator in sync every day without requiring manual intervention.
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
## Description
This PR contains measures to tighten four tests, all of which exhibited intermittent failures due to race conditions.
* `test_bar_select_different_facet` and `test_per_capita_metadata`: These tests occasionally failed via the same mechanism. The presence of the element was waited for, but this sometimes resolved before the content tested for afterwards was ready. We resolved this by waiting for the relevant element to be visible and to contain that text.
* `test_no_facet_choices_available` and `test_select_different_facet`: These tests both assumed the existence of certain elements after the chart finished loading (which was usually true). We now explicitly wait for them.

## Notes
These were based on reports by @juliawu (thank you!). I could replicate the first two issues. However, I was unable to replicate the last two. Nevertheless, I did identify those possible routes to a race condition in those latter tests and resolved them.
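The wait pattern described above maps onto Selenium explicit waits roughly as in the sketch below; the locator and helper name are placeholders, not the selectors used by the actual tests.

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def wait_for_facet_label(driver, expected_text, timeout=30):
    # Placeholder locator; the real tests target their own chart/facet elements.
    locator = (By.CSS_SELECTOR, ".facet-selector .selected-facet")

    wait = WebDriverWait(driver, timeout)
    # Wait for the element to be visible, not merely present in the DOM...
    wait.until(EC.visibility_of_element_located(locator))
    # ...and for it to actually show the expected text before asserting on it.
    wait.until(EC.text_to_be_present_in_element(locator, expected_text))
```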
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
…AI App with Periodic Updates (#5512) Productionized a new [Vertex AI Search App](https://pantheon.corp.google.com/gen-app-builder/locations/global/engines/full-statvar-search-prod-p_1757437817854/overview/system?e=13803378&mods=-monitoring_api_staging&project=datcom-nl) which is connected to periodic updates from newly added BigQuery statVars. This new Vertex AI Search app is connected to a datastore with periodic ingestion from my Cloud Storage bucket. A weekly GCP Workflow triggers a series of Cloud Run Jobs to run the BigQuery Diff mode of the NL Metadata Generator and output the results to the GCS bucket from which the datastore periodically ingests. Evals run against this new VAI app show results similar to the old VAI app.
…ompletion (#5511) This adds metrics for things such as SV autocomplete selection and trigger. It gives the breakdown with the correct parameters to better understand usage, including the query at selection, the selected stat var, and more. --------- Co-authored-by: Christie Ellks <[email protected]>
## Description
This PR improves the ability of the observation specs library (as used by the API dialog) to discern between custom DCs and the standard instance.

The previous logic relied solely on the existence of the apiRoot. If the apiRoot existed and was not the standard DC API root, we treated it as a custom DC. This worked for web components, where an apiRoot was passed into the web component in order to use a custom DC API root. However, it did not discern between custom DCs when in the explore context. In the explore context, the apiRoot is always undefined, and so we need another mechanism to determine whether we are in a custom DC.

### Explore Context
In the explore context, we now rely on the `globalThis.isCustomDC` attribute. There is an important caveat: custom DCs are not guaranteed to supply this, as it is generated in the Jinja templates, and custom DCs usually, but not always, set this variable. However, we do know that the primary DC does set it, and it is the only place that sets it to `0`. We use that knowledge to determine whether we are in a custom environment: if `isCustomDC` exists and is `0`, we are not in a custom DC. Otherwise we are in a custom DC. If we are not in a custom DC, we use the `DEFAULT_API_V2_ENDPOINT` as before. Otherwise, we get the current origin from the URL and append the custom DC endpoint to it.

### Web Component Context
This behaves largely as it did before. We will always have an API root (because if one is not supplied, it is set to the default API root). If we are using the default API root, we consider ourselves not in a custom DC. Any other API root means we are in a custom DC.

## Notes
As before, we are assuming that a custom DC will have the `/core/api` API available. This is the case on staging, but it is not the case locally when running `-e custom`. Thus locally, you will get a link that doesn't exist in custom mode. However, you will get a valid link in a true custom instance.

An additional minor note: I moved the two API root and path related constants into the same constants file that contains the DEFAULT_API_ENDPOINT (used by the web components).
…tomated Deployments to Mixer and Website (#5413)

This PR has quite a few different changes:
1. Sets up a clouddeploy.yaml to hold the configuration for our different Cloud Deploy release stages, including pre- and post-deploy steps.
2. Sets up the Skaffold.yaml and updates all the necessary Helm charts.
3. Creates a script to deploy to Cloud Endpoints - note that this is partially taken from deploy_gke.sh (https://github.com/datacommonsorg/website/blob/27c6317cbdd85df7c42750b6117b993c84814476/scripts/deploy_gke_helm.sh#L106, soon to be deleted).
4. Checks in the code for the datacommons-script-runner image, which is essentially gcloud plus all the website code, containing all of our scripts. I've been using this image for all the pre/post-deploy stages. It builds much faster than the website image but should be optimized further.

This code should be submitted WITH datacommonsorg/mixer#1582
…ags. (#5525) Adds the ability to roll out every feature flag at an experimental percentage.
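A common way to implement percentage rollouts is deterministic hash-based bucketing per user or session; this is only an illustrative sketch with invented names, not necessarily how the website's feature flag handling works.

```python
import hashlib

def is_flag_enabled(flag_name: str, user_id: str, rollout_percentage: float) -> bool:
    """Deterministically bucket a user into [0, 100) and compare to the rollout %."""
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10000 / 100.0  # 0.00 .. 99.99
    return bucket < rollout_percentage

# Example: a flag at 10% is enabled for roughly 1 in 10 users,
# and always the same users across requests.
print(is_flag_enabled("standardized_vis_tool", "user-123", 10.0))
```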
Fixes the broken timeline tool links on /explore/equity by:
(1) Switching to new links to the map tool, with more interesting/clear stat vars.
(2) Updating the heading of the section to say "Visualization tool" instead of "Timeline tool".

With this change, the links now go to:
* Gini Index: https://datacommons.org/tools/visualization#visType%3Dmap%26place%3DEarth%26placeType%3DCountry%26sv%3D%7B%22dcid%22%3A%22GiniIndex_EconomicActivity%22%7D
* Electricity Access: https://datacommons.org/tools/visualization#visType%3Dmap%26place%3DEarth%26placeType%3DCountry%26sv%3D%7B%22dcid%22%3A%22sdg%2FEG_ACS_ELEC%22%7D
* Internet Access: https://datacommons.org/tools/visualization#visType%3Dmap%26place%3DEarth%26placeType%3DCountry%26sv%3D%7B%22dcid%22%3A%22sdg%2FIT_USE_ii99%22%7D
* Access to basic water services: https://datacommons.org/tools/visualization#visType%3Dmap%26place%3DEarth%26placeType%3DCountry%26sv%3D%7B%22dcid%22%3A%22sdg%2FSH_H2O_SAFE%22%7D

### Screenshot
<img width="1988" height="1314" alt="image" src="https://github.com/user-attachments/assets/c818313a-896f-4014-8553-b68978a2196d" />
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
…oml. Updated run_server.sh script to use uv for local development (#5762)

Updated the Data Commons website Python server to use uv for development mode only.
* Updated `first_time_setup.md` instructions to include installing `uv`
* Added the `datacommons-website-server` package in `server/pyproject.toml`
* Updated `./run_server.sh` to ensure `uv` and `protoc` are installed, then to start the development server with `uv`.

Usage (error message before installing uv):
```
$ ./run_server.sh
Error: uv could not be found. Please install it and try again.
```

After installing uv:
```
$ ./run_server.sh
Resolved 166 packages in 3ms
Audited 160 packages in 4ms
Starting localhost with FLASK_ENV='local' on port='8080'
Not in Google network: <urlopen error [Errno 8] nodename nor servname provided, or not known>
[17:51:27][INFO ][util.py:593] https://autopush.api.datacommons.org/version is up running
[17:51:27][INFO ][web_app.py:27] Run web server in local mode
 * Serving Flask app 'server.__init__'
 * Debug mode: on
```

Still to do:
- Update the website production build (`build/web_server`) to use uv
- Update the custom Data Commons build (`build/cdc_services`) to use uv
- Update the NL server (`./run_nl_server.sh` and all uses of `nl_app.py`) to use uv
- Later: Update all tools/* to use uv

---------
Co-authored-by: Christie Ellks <[email protected]>
…iment Flags (#5771) Both of these feature flags have equivalent *_ga flags, where the experiment and ga flags are rolled out in all the same environments. Since the _ga flag is used as an OR'd condition with the _experiment flag, those flags are currently useless, so I will remove them. It is safe to remove the feature flags before the code changes are deployed because that code is never executed currently, since the _ga flag takes over. Changes to production feature flags will come in a separate PR. Also removes page overview links - this is always disabled and is a subfeature of the page overview.
Related PRs: * add cicd tag for mixer: datacommonsorg/mixer#1677 * previous example: https://github.com/datacommonsorg/website/pull/5702/files#diff-b1025b4cb9d51bffd16e92958b785525b20d645644cd00b38a9bacb7a9c6e935 --------- Co-authored-by: Gabriel Mechali <[email protected]>
The VAI Medium Threshold feature flag was used for experimenting, but it turned out to be too high - we will not go with it. Deprecating it now; since it was never enabled in any environment, it is safe to remove the feature flag at the same time as we rip out the code. Production will be handled separately.
This script loads all the feature flags from the GitHub directory and outputs something like the table below, which makes it easy to visualize which flags were deprecated or only partially rolled out.

From this, I have drawn the following conclusions:
- Multiple features were never rolled out to dev.
- show_api_modal and vai_medium_relevance_threshold are ready for deprecation
- enable_gemini_2_5_flash and enable_gemini_2_5_flash_lite are ready for deprecation
- follow_up_questions_experiment and page_overview_experiment are ready for deprecation

In follow-ups, I will get rid of some of these unneeded flags. This also makes it clear that custom DC is the main blocker for deprecating a lot more features. We will need a solution for this.

| Feature Flag | autopush | custom | dev | local | production | staging |
| --- | --- | --- | --- | --- | --- | --- |
| autocomplete | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| biomed_nl | ❌ | ❌ | ✅ | ✅ | ❌ | ❌ |
| data_overview | ✅ | ❌ | ✅ | ✅ | ❌ | ❌ |
| enable_gemini_2_5_flash | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| enable_gemini_2_5_flash_lite | ❌ | ❌ | ❌ | ❌ | MISSING | ❌ |
| enable_nl_agent_detector | ✅ | ❌ | ✅ | ✅ | MISSING | ❌ |
| enable_ranking_tile_scroll | ✅ | ❌ | ✅ | ✅ | MISSING | ❌ |
| enable_stat_var_autocomplete | ✅ | ❌ | ✅ | ✅ | ❌ | ✅ |
| explore_result_header | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| factcheck_redirect | ✅ | ❌ | ✅ | ❌ | MISSING | ❌ |
| follow_up_questions_experiment | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| follow_up_questions_ga | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| metadata_modal | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| page_overview_experiment | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| page_overview_ga | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| page_overview_links | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| show_api_modal | MISSING | MISSING | MISSING | MISSING | ❌ | MISSING |
| standardized_vis_tool | ✅ | ❌ | ✅ | ✅ | MISSING | ❌ |
| vai_for_statvar_search | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
| vai_medium_relevance_threshold | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |

---------
Co-authored-by: Christie Ellks <[email protected]>
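A flag-audit script of this shape can be quite small; the sketch below assumes one JSON file per environment containing `{"name": ..., "enabled": ...}` entries, and the config path is an assumption - the repo's actual flag config files may be named and structured differently.

```python
# Sketch: assumes one JSON file per environment, each a list of
# {"name": ..., "enabled": ...} entries. The real config layout may differ.
import json
from pathlib import Path

CONFIG_DIR = Path("server/config/feature_flag_configs")  # assumed location
ENVIRONMENTS = ["autopush", "custom", "dev", "local", "production", "staging"]

def load_flags() -> dict[str, dict[str, str]]:
    """Return {flag_name: {environment: "✅" / "❌" / "MISSING"}}."""
    table: dict[str, dict[str, str]] = {}
    for env in ENVIRONMENTS:
        path = CONFIG_DIR / f"{env}.json"
        flags = {f["name"]: f["enabled"] for f in json.loads(path.read_text())}
        for name, enabled in flags.items():
            table.setdefault(name, {})[env] = "✅" if enabled else "❌"
    # Mark flags that are absent from some environments.
    for by_env in table.values():
        for env in ENVIRONMENTS:
            by_env.setdefault(env, "MISSING")
    return table

if __name__ == "__main__":
    for name, by_env in sorted(load_flags().items()):
        print(name, *(by_env[env] for env in ENVIRONMENTS))
```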
The enable_gemini_2_5_flash_lite flag was created but never enabled; we no longer need it. The enable_gemini_2_5_flash flag is now enabled in all environments, but references to it in the code were deleted in 7102bd9 (2 months ago). It is safe to remove all of those feature flags; Gemini 2.5 Flash will remain the fallback model. Removal of enable_gemini_2_5_flash in production will be done in a separate PR.
Feature flag that will be used in #5215
This change tries to update the vite version from the dependabot alerts: https://github.com/datacommonsorg/website/security/dependabot?before=Y3Vyc29yOnYyOpLLP_GLQ5WBBiXPAAAAAWami18%3D&q=is%3Aopen+manifest%3Apackages%2Fclient%2Fpackage-lock.json%2Cpackages%2Fweb-components%2Fpackage-lock.json
Deletes a diffing tool introduced in #4355 that is now unused, since we rely more on screenshot testing.
I think the intent from #5748 was to use this env -- ptal
Includes updates to the core_topics and enum_topics scripts.

Also updates core_topics to use the new staging env for the API: https://undata-staging-datacommons-web-service-91813941917.us-central1.run.app
* There's currently a bug with /v1/bulk/info/variable-group for custom DCs (fixed in datacommonsorg/mixer#1688), so running the core_topics script will produce an incomplete output until this fix is incorporated
* It also looks like the ILO data is removed/missing from this custom DC instance, so running core_topics with undata_ilo will produce an empty output

Tested by running the scripts and comparing output.
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
The bulk_download tool was added to support a few data science courses. It hasn't gotten traffic in a while, and we no longer include the HTML elements for this link to show up. This deletes the package and replaces the URL handler with a permanent redirect.
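In Flask (which the website server uses), replacing a handler with a permanent redirect looks roughly like the sketch below; the blueprint name, route, and redirect target are placeholders rather than the PR's actual values.

```python
from flask import Blueprint, redirect

# Placeholder blueprint/route names; the real handler lives in the website's
# own routing modules and may redirect to a different target.
bp = Blueprint("tools", __name__)

@bp.route("/tools/download/bulk")
def bulk_download_redirect():
    # 301 marks the move as permanent so clients and crawlers update their links.
    return redirect("/tools/download", code=301)
```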
…5716) Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 3.14.1 to 3.14.2.

From [js-yaml's changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md):
* [3.14.2] - 2025-11-15: Security - backported the v4.1.1 fix to v3.
* [4.1.1] - 2025-11-12: Security - fix prototype pollution issue in the YAML merge (<<) operator.
* ... (truncated)

Commits:
* [9963d36](https://github.com/nodeca/js-yaml/commit/9963d366dfbde0c69722452bcd40b41e7e4160a0) 3.14.2 released
* [10d3c8e](https://github.com/nodeca/js-yaml/commit/10d3c8e70a6888543f5cdb656bb39f73e0ea77c1) dist rebuild
* [5278870](https://github.com/nodeca/js-yaml/commit/5278870a17454fe8621dbd8c445c412529525266) fix prototype pollution in merge (<<) ([#731](https://redirect.github.com/nodeca/js-yaml/issues/731))
* See the full diff in the [compare view](https://github.com/nodeca/js-yaml/compare/3.14.1...3.14.2)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself.

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Xiao Shi <[email protected]>
This change bumps vite from 5.x to 7.x and react-syntax-highlighter from 15.x to 16.x. The local build is set to Node version 23+; Node 18.x is no longer compatible with 5.x.
Local tests and the server run without failure; the previous commit run succeeded.
The legacy place pages have been deprecated so there are no more calls to get_landing_page_data (https://github.com/search?q=repo%3Adatacommonsorg%2Fwebsite+get_landing_page_data&type=code)
This pull request updates the golden files automatically via Cloud Build. Please review the changes carefully. [Cloud Build Log](https://console.cloud.google.com/cloud-build/builds/5d462925-b04a-4470-b3c9-a2b1131ba05f?project=datcom-ci) --------- Co-authored-by: datacommons-robot-author <[email protected]> Co-authored-by: Rohit Kumar <[email protected]> Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
This PR automatically updates the `mixer` and `import` submodules to their latest `master` branches. Co-authored-by: datacommons-robot-author <[email protected]>
Resolves most packages except the minimist and d3 updates.
NL dependabot link: https://github.com/datacommonsorg/website/security/dependabot?q=is%3Aopen+manifest%3Anl_requirements.txt Need to bump up torch and transformers.