Time and Date for When a Field Was Changed?
Hi folks, I'm not sure this is the best way to put this together, but here is my scenario. I have one table (Intersections) that includes a "Name" field (text), five "Route" checkboxes, five "Route Sequence" fields (numbered 1 to 50 or so), and a "Status" field (dropdown choice). Caveat: each Intersection "Name" can have multiple "Route" checkboxes checked.

My questions: I would like to capture a date/time whenever the "Status" dropdown is changed, and then create a report based on that last status-change time (if a record's status was changed within the last day, add it to the report). Is there any way to do that?

Thank you for any assistance.

Will

------------------------------ William Wallace ------------------------------

Pipeline trigger(s) based on formula field changes
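A common pattern for this (an assumption, not something stated in the post) is to have a Pipeline stamp a date/time field whenever Status is modified, then filter a report on that field. As a minimal sketch, assuming a hypothetical "Status Last Changed" field with field ID 15 and Quickbase's OAF ("on or after") query operator, the "changed within the last day" filter could be built like this:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical field ID for illustration: 15 = "Status Last Changed",
# a Date/Time field a Pipeline would stamp whenever Status is modified.
STATUS_CHANGED_FID = 15

def changed_in_last_day_where(now=None):
    """Build a Quickbase query 'where' clause matching records whose
    stamped status-change time is on or after 24 hours ago.
    OAF is Quickbase's 'on or after' comparison operator."""
    now = now or datetime.now(timezone.utc)
    cutoff = (now - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return "{%d.OAF.'%s'}" % (STATUS_CHANGED_FID, cutoff)
```

The same filter can usually be expressed directly in a native report ("Status Last Changed is during the last 1 day"), so the code above is only needed if the report is generated through the API.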
I've been trying to run a Pipeline triggered by formula field changes, but the triggers don't seem to fire the Pipeline. For instance, I have a numeric formula field that changes daily: [Recommended Frequency] - [Days Since Last Activity]. If it equals 1 today, tomorrow it will equal 0, so the field value changes, but the Pipeline does not trigger. My advanced query looks for this change, where field ID 181 is the formula field: {'181'.EX.'0'}. I also have a checkbox formula field that is super simple: it checks whether the next preventative maintenance date equals today, e.g. [Next PM Date] = Today(), then true, else false. I've tried triggering on either condition once it changes, but the Pipeline doesn't seem to pick up on either change. Is it that Pipelines just don't trigger on formula field changes?

------------------------------ AR ------------------------------

Information on how to work Pipelines
With the automation feature, how could one set up a Pipeline to uncheck a box in related records when a box is checked in a new record, similar to this discussion in 2020?

------------------------------ Lorrie B ------------------------------

Table-to-Table Pipeline - Field Limit?
Hi, I'm creating a Pipeline to connect two tables within one app. The source table has around 40 fields (around 5 KB), and the target table has around 200 fields (around 4 MB). When configuring the "Prepare Bulk Record Upsert" action, the field list for the large target table won't load, and it shows the error: "We're unable to show all options at this time. If you don't see what you're looking for, reload this page. If you need additional support, please contact us." (image attached) I have tried reloading, manually typing in the field IDs, and manually typing the field names in the format {{b.field_name}}, none of which has worked. Has anyone found a workaround for this issue, or at least an explicit note from Quickbase on a field limit for these actions?

Thank you!

Shelby

------------------------------ Shelby Pons ------------------------------

Create Record based on Due Date of Task - 7 days
Good day, I have an Assets table; each asset requires preventative maintenance on a schedule of X days (30, 60, 90, etc.) after the last completed maintenance, which produces the PM Due Date. In the same app, I have a Work Order table that assigns the task to personnel based on the selected Asset. Is it possible to automate the creation of a new record (work order) when the PM due date on the Asset table is approaching, i.e. seven days prior? Any thoughts are appreciated.

Landon

------------------------------ Landon Smallwood ------------------------------

Pipeline efficiency -- one or many?
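Not from the post itself, but as a sketch of the trigger logic: a daily scheduled Pipeline could check each Asset's PM Due Date against a seven-day lead window and create the Work Order once the date enters that window. The function name and window rule below are illustrative assumptions:

```python
from datetime import date, timedelta

def pm_due_soon(pm_due: date, today: date, lead_days: int = 7) -> bool:
    """Return True when the preventative-maintenance due date falls
    within the lead window (lead_days before the due date, up to and
    including the due date itself). A daily scheduled Pipeline could
    run this check per Asset and, on True, create a Work Order record,
    guarding against duplicates with a 'work order exists' flag."""
    return pm_due - timedelta(days=lead_days) <= today <= pm_due
```

The same window can be expressed as a native report filter ("PM Due Date is on or before 7 days from today"), which a scheduled Pipeline's search step could use instead of computing dates in code.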
Hi: I'm just getting started with Pipelines and have the basic concept down. My question is about the most efficient design for some of my Pipeline ideas in the Quickbase channel. I have essentially one trigger (record created) that will then add up to five "result" records in a separate table. Each result is slightly different, but the bulk of the information stays the same (of roughly 40 fields, only 4 change across the up-to-5 results). I anticipate the Pipeline will usually create 1-3 records, with the average around two or a little under. Later on, for each of those possible five result records, I'll need Pipelines to handle updates and deletions.

Dealing only with the create-record Pipeline for this question: is it more efficient (on the app) to write one Pipeline that triggers on Record Created and uses if/then branches to create up to five result records, or to have five Pipelines that each trigger slightly differently (based on the conditionals that a single Pipeline would otherwise use)? I'm assuming that from the answer I can extrapolate to the Pipelines handling Record Updated triggers. The Record Deleted trigger is just a single Pipeline, since it simply deletes all associated records with no conditionals or loops beyond the initial search, so that won't change.

From a building standpoint, making five Pipelines seems more efficient, simpler, and therefore more accurate than making one long Pipeline to handle all five cases. It also seems I could handle much of the duplication by duplicating an entire Pipeline from the Pipelines dashboard, or with export/import of YAML files if need be.

Thanks for the input, Dave

------------------------------ David Halter ------------------------------

Using Pipelines to copy table data vs. cross-app relationships
I have what I think is a fairly simple question, but I would like an opinion on the best approach. I have two apps and would like to copy table data from one to the other: specifically, copy a "zip codes" table from App B into App A and have a Pipeline keep the tables synced. Keep in mind, this zip code table is not changed very often, maybe a few times a week. About 90 users use App A throughout the day, but there is downtime from 8 PM to about 4 AM each night when few, if any, users are in the system. Should I have the Pipeline copy data as things change (trigger on change), or should I run a nightly update so the sync happens automatically without any slowdown in App A? What's the most efficient approach?

------------------------------ AR ------------------------------

Using AND & OR in Quickbase API Pipeline Channel
I am trying to build a table report in an email, and I need to filter with both AND and OR: find the related records where the status is "Received", "Stored", or "Client Hold". What is the syntax to nest the OR statements? Here's what I have, but it's not working:

    {
      "from": "brce2frp8",
      "select": [33, 89, 117, 11, 70, 63, 27, 28, 29, 72],
      "where": "{127.EX.'{{a.id|int}}'}AND ({71.EX.'Received'}OR{71.EX.'Stored'}OR{71.EX.'Client Hold'})",
      "sortBy": [
        { "fieldId": 33, "order": "ASC" },
        { "fieldId": 117, "order": "ASC" }
      ]
    }

------------------------------ Julie Meeker ------------------------------

Create a Day Counter Since a Field Change
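One way to sanity-check the nesting is to build the where clause programmatically. This is a sketch, not a confirmed fix: it assembles {fid.OP.'value'} terms joined by AND/OR with parentheses around the OR group, which is the grouping the Quickbase query grammar expects, using the field IDs and Jinja template from the post:

```python
def qb_where(related_fid, related_value, status_fid, statuses):
    """Build a Quickbase query 'where' clause that ANDs a
    related-record match with a parenthesized OR group of
    status values: {fid.EX.'v'}AND({fid.EX.'a'}OR{fid.EX.'b'}...)"""
    or_group = "OR".join("{%s.EX.'%s'}" % (status_fid, s) for s in statuses)
    return "{%s.EX.'%s'}AND(%s)" % (related_fid, related_value, or_group)

# Field IDs and the Jinja placeholder are taken from the post.
where = qb_where(127, "{{a.id|int}}", 71, ["Received", "Stored", "Client Hold"])
```

Building the string this way makes the grouping explicit and avoids typos such as stray whitespace or unbalanced braces when the clause grows.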
Hello, is there a way for a day counter to reflect how long it has been since a field was altered? I have a STATUS field on a table, and I want it to count the days since the last time the status changed. This is not to be confused with counting the days since the record itself was last modified; I mean that field specifically. Thank you so much for your support!

------------------------------ Anthony Wong ------------------------------
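The usual approach (an assumption on my part, not stated in the post) is to stamp a "Status Last Changed" date field via a Pipeline or a form rule whenever Status changes, then derive the counter from it; in Quickbase formula terms, something like ToDays(Today() - [Status Last Changed]). The same arithmetic in Python:

```python
from datetime import date

def days_since_change(last_changed: date, today: date) -> int:
    """Days since the Status field was last altered, assuming some
    mechanism (Pipeline or form rule) stamps last_changed whenever
    Status is modified. Mirrors ToDays(Today() - [Status Last Changed])."""
    return (today - last_changed).days
```

Because the counter is derived from the stamped date, it updates automatically each day without any record being edited.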