2 changes: 1 addition & 1 deletion .flox/env/manifest.toml
@@ -10,7 +10,7 @@ version = 1
# the `[install]` section.
[install]
# Python - Python itself is installed on activation via `uv sync`
-uv = { pkg-path = "uv", pkg-group = "uv", version = "0.7.8" }
+uv = { pkg-path = "uv", pkg-group = "uv", version = "0.8.19" }
xmlsec = { pkg-path = "xmlsec", version = "1.3.6" }
freetds = { pkg-path = "freetds" } # For pymssql
# Node
2 changes: 1 addition & 1 deletion .github/workflows/build-hogql-parser.yml
@@ -66,7 +66,7 @@ jobs:

- uses: actions/setup-python@v5
with:
-python-version: '3.12.11'
+python-version: '3.13.7'

- name: Build sdist
if: matrix.os == 'ubuntu-22.04' # Only build the sdist once
2 changes: 1 addition & 1 deletion .github/workflows/ci-backend-update-test-timing.yml
@@ -28,7 +28,7 @@ jobs:
concurrency: 1
group: 1
token: ${{ secrets.POSTHOG_BOT_PAT }}
-python-version: '3.12.11'
+python-version: '3.13.7'
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
segment: 'FOSS'
person-on-events: false
46 changes: 23 additions & 23 deletions .github/workflows/ci-backend.yml
@@ -121,15 +121,15 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
-python-version: 3.12.11
+python-version: 3.13.7
token: ${{ secrets.POSTHOG_BOT_PAT }}

- name: Install uv
id: setup-uv
uses: astral-sh/setup-uv@0c5e2b8115b80b4c7c5ddf6ffdd634974642d182 # v5.4.1
with:
enable-cache: true
-version: 0.7.8
+version: 0.8.19

- name: Install SAML (python3-saml) dependencies
if: steps.setup-uv.outputs.cache-hit != 'true'
@@ -327,7 +327,7 @@ jobs:
strategy:
fail-fast: false
matrix:
-python-version: ['3.12.11']
+python-version: ['3.13.7']
clickhouse-server-image: ['clickhouse/clickhouse-server:25.6.9.98']
segment: ['Core']
person-on-events: [false]
@@ -380,121 +380,121 @@ jobs:
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 1
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 2
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 3
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 4
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 5
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 6
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 7
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 8
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 9
- segment: 'Core'
person-on-events: true
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 10
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 1
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 2
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 3
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 4
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 5
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 6
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 7
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 8
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 9
- segment: 'Temporal'
person-on-events: false
clickhouse-server-image: 'clickhouse/clickhouse-server:25.6.9.98'
-python-version: '3.12.11'
+python-version: '3.13.7'
concurrency: 10
group: 10

4 changes: 2 additions & 2 deletions .github/workflows/ci-e2e-playwright.yml
@@ -313,15 +313,15 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
-python-version: 3.12.11
+python-version: 3.13.7
token: ${{ secrets.POSTHOG_BOT_PAT }}

- name: Install uv
id: setup-uv
uses: astral-sh/setup-uv@0c5e2b8115b80b4c7c5ddf6ffdd634974642d182 # v5.4.1
with:
enable-cache: true
-version: 0.7.8
+version: 0.8.19

- name: Determine if hogql-parser has changed compared to master
shell: bash
8 changes: 4 additions & 4 deletions .github/workflows/ci-python.yml
@@ -58,14 +58,14 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
-python-version: 3.12.11
+python-version: 3.13.7

- name: Install uv
id: setup-uv
uses: astral-sh/setup-uv@0c5e2b8115b80b4c7c5ddf6ffdd634974642d182 # v5.4.1
with:
enable-cache: true
-version: 0.7.8
+version: 0.8.19

- name: Install SAML (python3-saml) dependencies
if: steps.setup-uv.outputs.cache-hit != 'true'
@@ -98,9 +98,9 @@
uses: actions/cache@v4
with:
path: .mypy_cache
-key: mypy-cache-${{ runner.os }}-3.12-${{ hashFiles('**/pyproject.toml', '**/mypy.ini') }}
+key: mypy-cache-${{ runner.os }}-3.13-${{ hashFiles('**/pyproject.toml', '**/mypy.ini') }}
restore-keys: |
-mypy-cache-${{ runner.os }}-3.12-
+mypy-cache-${{ runner.os }}-3.13-

- name: Check static typing
shell: bash -e {0}
4 changes: 2 additions & 2 deletions Dockerfile
@@ -52,7 +52,7 @@

# Compile and install system dependencies
# Add Confluent's client repository for librdkafka 2.10.1
RUN apt-get update && \

apt-get install -y --no-install-recommends \
"wget" \
"gnupg" \
@@ -109,7 +109,7 @@
RUN NODE_OPTIONS="--max-old-space-size=16384" bin/turbo --filter=@posthog/cyclotron build

# Then build the plugin server with increased memory
RUN NODE_OPTIONS="--max-old-space-size=16384" bin/turbo --filter=@posthog/plugin-server build


# only prod dependencies in the node_module folder
# as we will copy it to the last image.
@@ -122,7 +122,7 @@
# ---------------------------------------------------------
#
# Same as pyproject.toml so that uv can pick it up and doesn't need to download a different Python version.
-FROM python:3.12.11-slim-bookworm AS posthog-build
+FROM python:3.13.7-slim-bookworm AS posthog-build
WORKDIR /code
SHELL ["/bin/bash", "-e", "-o", "pipefail", "-c"]

@@ -130,7 +130,7 @@
# We install those dependencies on a custom folder that we will
# then copy to the last image.
COPY pyproject.toml uv.lock ./
RUN apt-get update && \

apt-get install -y --no-install-recommends \
"build-essential" \
"git" \
@@ -168,7 +168,7 @@
SHELL ["/bin/bash", "-e", "-o", "pipefail", "-c"]

# Fetch the GeoLite2-City database that will be used for IP geolocation within Django.
RUN apt-get update && \

apt-get install -y --no-install-recommends \
"ca-certificates" \
"curl" \
@@ -184,7 +184,7 @@
# ---------------------------------------------------------
#
# NOTE: v1.32 is running bullseye, v1.33 is running bookworm
-FROM unit:1.33.0-python3.12
+FROM unit:1.33.0-python3.13
WORKDIR /code
SHELL ["/bin/bash", "-e", "-o", "pipefail", "-c"]
ENV PYTHONUNBUFFERED 1
@@ -192,7 +192,7 @@
# Install OS runtime dependencies.
# Note: please add in this stage runtime dependences only!
# Add Confluent's client repository for librdkafka runtime
RUN apt-get update && \

apt-get install -y --no-install-recommends \
"wget" \
"gnupg" \
@@ -218,7 +218,7 @@
rm -rf /var/lib/apt/lists/*

# Install MS SQL dependencies
RUN curl https://packages.microsoft.com/keys/microsoft.asc | tee /etc/apt/trusted.gpg.d/microsoft.asc && \

curl https://packages.microsoft.com/config/debian/11/prod.list | tee /etc/apt/sources.list.d/mssql-release.list && \
apt-get update && \
ACCEPT_EULA=Y apt-get install -y msodbcsql18 && \
@@ -227,7 +227,7 @@
# Install Node.js 22.17.1 with architecture detection and verification
ENV NODE_VERSION 22.17.1

RUN ARCH= && dpkgArch="$(dpkg --print-architecture)" \

&& case "${dpkgArch##*-}" in \
amd64) ARCH='x64';; \
ppc64el) ARCH='ppc64le';; \
@@ -304,8 +304,8 @@

# Validate video export dependencies
RUN ffmpeg -version
RUN /python-runtime/bin/python -c "import playwright; print('Playwright package imported successfully')"

RUN /python-runtime/bin/python -c "from playwright.sync_api import sync_playwright; print('Playwright sync API available')"


# Copy the frontend assets from the frontend-build stage.
# TODO: this copy should not be necessary, we should remove it once we verify everything still works.
@@ -340,5 +340,5 @@
# Expose the port from which we serve OpenMetrics data.
EXPOSE 8001
COPY unit.json.tpl /docker-entrypoint.d/unit.json.tpl
USER root

CMD ["./bin/docker"]
2 changes: 1 addition & 1 deletion Dockerfile.ai-evals
@@ -1,6 +1,6 @@
# Same as pyproject.toml so that uv can pick it up and doesn't need to download a different Python version.
-FROM python:3.12.11-slim-bookworm AS python-base
+FROM python:3.13.7-slim-bookworm AS python-base
FROM cruizba/ubuntu-dind:latest

SHELL ["/bin/bash", "-e", "-o", "pipefail", "-c"]

# Copy Python base
@@ -13,7 +13,7 @@
COPY docker/ ./docker/

# Install system dependencies
RUN apt-get update && \

apt-get install -y --no-install-recommends \
"build-essential" \
"git" \
2 changes: 2 additions & 0 deletions common/hogql_parser/MANIFEST.in
@@ -0,0 +1,2 @@
+include README.md
+recursive-include hogql_parser-stubs *.pyi
4 changes: 2 additions & 2 deletions common/hogql_parser/setup.py
@@ -1,6 +1,6 @@
import platform

-from setuptools import Extension, setup
+from setuptools import Extension, find_packages, setup

system = platform.system()
if system not in ("Darwin", "Linux"):
@@ -42,7 +42,6 @@
maintainer_email="[email protected]",
long_description=open("README.md").read(),
long_description_content_type="text/markdown",
-packages=["hogql_parser-stubs"],
include_package_data=True,
ext_modules=[module],
python_requires=">=3.10",
@@ -55,5 +54,6 @@
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
+"Programming Language :: Python :: 3.13",
],
)
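The setup.py change above drops the hard-coded `packages=["hogql_parser-stubs"]` entry and imports setuptools' `find_packages`; the actual call site is in an elided hunk, so its arguments are unknown. As a hedged sketch of what `find_packages` does (the temporary directory and `mypkg` names below are illustrative, not from the PR):

```python
import os
import tempfile

from setuptools import find_packages

# find_packages() walks a directory tree and returns every directory that
# looks like an importable package, i.e. one containing an __init__.py.
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "mypkg", "sub"))
    open(os.path.join(root, "mypkg", "__init__.py"), "w").close()
    open(os.path.join(root, "mypkg", "sub", "__init__.py"), "w").close()
    print(sorted(find_packages(where=root)))  # ['mypkg', 'mypkg.sub']
```

Note that `hogql_parser-stubs` is not a valid Python identifier, so automatic discovery would not pick it up; the new `MANIFEST.in` entry (`recursive-include hogql_parser-stubs *.pyi`) appears to be what keeps the stub files in the source distribution instead.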
10 changes: 5 additions & 5 deletions common/hogvm/python/stl/__init__.py
@@ -430,7 +430,7 @@ def apply_interval_to_datetime(dt: dict, interval: dict) -> dict:

zone = dt["zone"] if is_hog_datetime(dt) else "UTC"
if is_hog_datetime(dt):
-base_dt = datetime.datetime.utcfromtimestamp(dt["dt"])
+base_dt = datetime.datetime.fromtimestamp(dt["dt"], datetime.UTC)
base_dt = pytz.timezone(zone).localize(base_dt)
else:
base_dt = datetime.datetime(dt["year"], dt["month"], dt["day"], tzinfo=pytz.timezone(zone))
@@ -518,7 +518,7 @@ def date_diff(args: list[Any], team: Optional["Team"], stdout: Optional[list[str
def to_dt(obj):
if is_hog_datetime(obj):
z = obj["zone"]
-return pytz.timezone(z).localize(datetime.datetime.utcfromtimestamp(obj["dt"]))
+return pytz.timezone(z).localize(datetime.datetime.fromtimestamp(obj["dt"], datetime.UTC))
elif is_hog_date(obj):
return pytz.UTC.localize(datetime.datetime(obj["year"], obj["month"], obj["day"]))
else:
@@ -558,7 +558,7 @@ def date_trunc(args: list[Any], team: Optional["Team"], stdout: Optional[list[st
raise ValueError("Expected a DateTime for dateTrunc")

zone = dt["zone"]
-base_dt = datetime.datetime.utcfromtimestamp(dt["dt"])
+base_dt = datetime.datetime.fromtimestamp(dt["dt"], datetime.UTC)
base_dt = pytz.timezone(zone).localize(base_dt)

if unit == "year":
@@ -681,7 +681,7 @@ def extract(args: list[Any], team: Optional["Team"], stdout: Optional[list[str]]
def to_dt(obj):
if is_hog_datetime(obj):
z = obj["zone"]
-return pytz.timezone(z).localize(datetime.datetime.utcfromtimestamp(obj["dt"]))
+return pytz.timezone(z).localize(datetime.datetime.fromtimestamp(obj["dt"], datetime.UTC))
elif is_hog_date(obj):
return pytz.UTC.localize(datetime.datetime(obj["year"], obj["month"], obj["day"]))
else:
@@ -787,7 +787,7 @@ def toStartOfWeek(args: list[Any], team: Optional["Team"], stdout: Optional[list
dt = toDateTime(f"{dt['year']}-{dt['month']:02d}-{dt['day']:02d}")
else:
raise ValueError("Expected a Date or DateTime")
-base_dt = datetime.datetime.utcfromtimestamp(dt["dt"])
+base_dt = datetime.datetime.fromtimestamp(dt["dt"], datetime.UTC)
zone = dt["zone"]
base_dt = pytz.timezone(zone).localize(base_dt)
weekday = base_dt.isoweekday() # Monday=1, Sunday=7
2 changes: 1 addition & 1 deletion ee/api/sentry_stats.py
@@ -101,7 +101,7 @@ def get_stats_for_timerange(
@api_view(["GET"])
def sentry_stats(request: HttpRequest):
try:
-current_time = datetime.utcnow()
+current_time = datetime.now(datetime.UTC)
target_end_date = current_time.strftime("%Y-%m-%dT%H:%M:%S")
target_start_date = (current_time - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%S")

2 changes: 1 addition & 1 deletion ee/tasks/auto_rollback_feature_flag.py
@@ -60,7 +60,7 @@ def check_condition(rollback_condition: dict, feature_flag: FeatureFlag) -> bool
base_start_date = created_date.strftime("%Y-%m-%dT%H:%M:%S")
base_end_date = (created_date + timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%S")

-current_time = datetime.utcnow()
+current_time = datetime.now(datetime.UTC)
target_end_date = current_time.strftime("%Y-%m-%dT%H:%M:%S")
target_start_date = (current_time - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%S")

2 changes: 1 addition & 1 deletion ee/tasks/subscriptions/__init__.py
@@ -272,7 +272,7 @@ def schedule_all_subscriptions() -> None:
NOTE: This task is scheduled hourly just before the hour allowing for the 15 minute timedelta to cover
all upcoming hourly scheduled subscriptions
"""
-now_with_buffer = datetime.utcnow() + timedelta(minutes=15)
+now_with_buffer = datetime.now(datetime.UTC) + timedelta(minutes=15)
subscriptions = (
Subscription.objects.filter(next_delivery_date__lte=now_with_buffer, deleted=False)
.exclude(dashboard__deleted=True)
2 changes: 1 addition & 1 deletion posthog/api/test/__snapshots__/test_api_docs.ambr
@@ -147,7 +147,7 @@
'/home/runner/work/posthog/posthog/products/tasks/backend/serializers.py: Warning [WorkflowStageViewSet > WorkflowStageSerializer]: unable to resolve type hint for function "get_agent". Consider using a type hint or @extend_schema_field. Defaulting to string.',
'/home/runner/work/posthog/posthog/products/tasks/backend/serializers.py: Warning [WorkflowStageViewSet > WorkflowStageSerializer]: unable to resolve type hint for function "get_task_count". Consider using a type hint or @extend_schema_field. Defaulting to string.',
'/home/runner/work/posthog/posthog/products/user_interviews/backend/api.py: Warning [UserInterviewViewSet]: could not derive type of path parameter "project_id" because model "products.user_interviews.backend.models.UserInterview" contained no such field. Consider annotating parameter with @extend_schema. Defaulting to "string".',
-'/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/pydantic/_internal/_model_construction.py: Warning [QueryViewSet > ModelMetaclass]: Encountered 2 components with identical names "Person" and different identities <class \'str\'> and <class \'posthog.api.person.PersonSerializer\'>. This will very likely result in an incorrect schema. Try renaming one.',
+'/opt/hostedtoolcache/Python/3.13.7/x64/lib/python3.13/site-packages/pydantic/_internal/_model_construction.py: Warning [QueryViewSet > ModelMetaclass]: Encountered 2 components with identical names "Person" and different identities <class \'str\'> and <class \'posthog.api.person.PersonSerializer\'>. This will very likely result in an incorrect schema. Try renaming one.',
'Warning: encountered multiple names for the same choice set (EffectiveMembershipLevelEnum). This may be unwanted even though the generated schema is technically correct. Add an entry to ENUM_NAME_OVERRIDES to fix the naming.',
'Warning: encountered multiple names for the same choice set (EffectivePrivilegeLevelEnum). This may be unwanted even though the generated schema is technically correct. Add an entry to ENUM_NAME_OVERRIDES to fix the naming.',
'Warning: encountered multiple names for the same choice set (HrefMatchingEnum). This may be unwanted even though the generated schema is technically correct. Add an entry to ENUM_NAME_OVERRIDES to fix the naming.',
4 changes: 2 additions & 2 deletions posthog/hogql_queries/utils/formula_ast.py
@@ -65,8 +65,8 @@ def _evaluate(self, node, const_map: dict[str, Any]):
return operand
raise ValueError(f"Operator {unary_op.__class__.__name__} not supported")

-elif isinstance(node, ast.Num):
-return node.n
+elif isinstance(node, ast.Constant) and isinstance(node.value, int | float):
+return node.value

elif isinstance(node, ast.Name):
try:
2 changes: 1 addition & 1 deletion posthog/management/commands/analyze_behavioral_cohorts.py
@@ -243,7 +243,7 @@ async def create_schedule(
schedule_id = f"behavioral-cohorts-{interval_minutes}min-{team_id or 'all'}-{int(time.time() * 1000)}"

# Calculate end time based on duration
-start_time = datetime.utcnow()
+start_time = datetime.now(datetime.UTC)
end_time = start_time + timedelta(hours=duration_hours)

# Calculate number of expected runs