Commit 6ac27dc

Update builder-love.md
added project milestones
1 parent 05c1416


1 file changed: +151 −25 lines changed


applications/builder-love.md

@@ -161,35 +161,161 @@ Resources required for migration:
 - Estimated hourly rate: $145
 - Total estimated expense: $23,200 (founder’s compensation)
 - Role will be fulfilled by founding member
-- Saas licenses for cloud hosting API and front end, AI tools, such as cursor.ai and gemini
-- google cloud sql postgres database: $150 per month
-- vercel for public beta: $20 per month
-- Cursor.ai: $65 per month
-- Gemini: $65 per month
-- Miscellaneous buffer: $700
-- Total cost: $1,000
 
-**Why is this work necessary?**
-
-We were able to create a minimum viable product using client-side json files generated locally using python. This process is fragile, not easily testable, will not scale, and client-side json is not suitable for complex queries. The project needs a server-side database and API to support growth beyond the MVP.
-
-## Deliverables
+## Milestone 1 - create data infrastructure with production-level availability and security
+- Estimated duration: 2 weeks
+- FTE: 2
+- Costs: $15,000
+
+<table>
+<thead>
+<tr>
+<th>Number</th>
+<th>Deliverable</th>
+<th>Specification</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>0a.</td>
+<td>License</td>
+<td>MIT license/public repo</td>
+</tr>
+<tr>
+<td>0b.</td>
+<td>Documentation</td>
+<td>Data infrastructure architecture description</td>
+</tr>
+<tr>
+<td>1</td>
+<td>Database</td>
+<td>Create a Google Cloud SQL Postgres instance and schemas</td>
+</tr>
+<tr>
+<td>2</td>
+<td>Cloud SQL proxy client</td>
+<td>Configure the Google Cloud SQL proxy client/server daemon</td>
+</tr>
+<tr>
+<td>3</td>
+<td>Local network</td>
+<td>Configure a Tailscale local client network for the development environment</td>
+</tr>
+<tr>
+<td>4</td>
+<td>Data orchestration</td>
+<td>Configure the Dagster daemons: the webserver for monitoring and the Dagster execution daemon</td>
+</tr>
+<tr>
+<td>5</td>
+<td>Database API</td>
+<td>Configure a Python Flask REST API on Google Cloud Run</td>
+</tr>
+<tr>
+<td>6</td>
+<td>API cache</td>
+<td>Configure Google Cloud Memorystore for Redis to cache API responses</td>
+</tr>
+</tbody>
+</table>
+
+## Milestone 2 - create data interfaces
+- Estimated duration: 1 week
+- FTE: 2
+- Costs: $7,500
+
+<table>
+<thead>
+<tr>
+<th>Number</th>
+<th>Deliverable</th>
+<th>Specification</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>0a.</td>
+<td>License</td>
+<td>MIT license/public repo</td>
+</tr>
+<tr>
+<td>0b.</td>
+<td>Documentation</td>
+<td>Document the first two data ingestion interfaces (GitHub API and Discourse API) and their associated scripts</td>
+</tr>
+<tr>
+<td>1</td>
+<td>Compute environment</td>
+<td>Configure Google Compute Engine resources for the data interfaces</td>
+</tr>
+<tr>
+<td>2</td>
+<td>Data interface scripts</td>
+<td>Rewrite the data ingestion, cleaning, and loading scripts, moving from the local Python/SQLite/JSON model to the cloud Postgres/Dagster/API server model</td>
+</tr>
+<tr>
+<td>3</td>
+<td>Improve rate limit handling</td>
+<td>Refactor the current data interface scripts to handle API rate limits more efficiently, saving time and compute costs</td>
+</tr>
+<tr>
+<td>4</td>
+<td>dbt transform, load, and test</td>
+<td>Configure dbt for scheduled data normalization, cleaning, loading, and testing jobs, i.e., moving data from the raw -> clean -> api schemas with built-in test scripts</td>
+</tr>
+</tbody>
+</table>
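For deliverable 3, rate-limit-aware fetching against the GitHub REST API might look roughly like the sketch below. The `X-RateLimit-*` headers are GitHub's documented ones; the retry loop and the injectable `session` parameter are illustrative assumptions, not the project's actual scripts.

```python
import time

import requests


def github_get(url, token, session=None):
    """GET a GitHub REST API URL, sleeping until the rate-limit window
    resets when the quota is exhausted, instead of failing and
    re-running the whole ingestion job."""
    session = session or requests.Session()
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    }
    while True:
        resp = session.get(url, headers=headers, timeout=30)
        # GitHub reports quota state on every response.
        remaining = int(resp.headers.get("X-RateLimit-Remaining", "1"))
        if resp.status_code == 403 and remaining == 0:
            reset_at = int(resp.headers.get("X-RateLimit-Reset", "0"))
            time.sleep(max(reset_at - time.time(), 1.0))
            continue  # retry once the window has reset
        resp.raise_for_status()
        return resp
```

Sleeping until the documented reset time wastes no quota on doomed retries; conditional requests (`If-None-Match` with stored ETags) are another documented way to stretch the same quota further.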
+
+## Milestone 3 - update charts and analytics
+- Estimated duration: 1 week
+- FTE: 2
+- Costs: $7,500
+
+<table>
+<thead>
+<tr>
+<th>Number</th>
+<th>Deliverable</th>
+<th>Specification</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>0a.</td>
+<td>License</td>
+<td>MIT license/public repo</td>
+</tr>
+<tr>
+<td>0b.</td>
+<td>Documentation</td>
+<td>Document the Postgres database API that the builder.love web app will use to request data, including available endpoints and access control configurations</td>
+</tr>
+<tr>
+<td>1</td>
+<td>API authentication</td>
+<td>Write code to authenticate and connect to the API from the web app</td>
+</tr>
+<tr>
+<td>2</td>
+<td>Query</td>
+<td>Update the React/Next.js/Tailwind project to implement functions/hooks that fetch data from the API instead of client-side JSON</td>
+</tr>
+<tr>
+<td>3</td>
+<td>Vercel</td>
+<td>Update the Vercel configuration with the required API environment variables, and configure CORS (Cross-Origin Resource Sharing) on the API to allow requests from the Vercel domain</td>
+</tr>
+</tbody>
+</table>
+
+### Deliverables
 - Data infrastructure with production level availability and security
-- Google cloud sql postgres instance
-- Highly available machines for running pipelines
-- Google sql proxy client/server connection daemon
-- Tailscale local client network
-- Dagster daemons: webserver for monitoring, and dagster execution daemon
-- Configure front end API service
-- Create staging/production relationship
-- Integrate Google Cloud Database Migration Service to ensure no downtime
 - Robust and efficient data pipelines that can easily expand to onboard new datasets
-- Pipelines that connect to the right data sources
-- Query buildout
-- API rate handling
-- Better testing built into pipelines to flag for issues
 - Data analytics methodology that simplifies platform front end work
-- Standardize analytics tables, queries, charts, and front end tools so that the product team can move fast and learn
+
+**Why is this work necessary?**
+
+We were able to create a minimum viable product using client-side JSON files generated locally with Python. This process is fragile, not easily testable, and will not scale, and client-side JSON is not suitable for complex queries. The project needs a server-side database and API to support growth beyond the MVP.
 
 ## Future Plans
 
0 commit comments
