stephanie-wang
653 contributions in the last year
Contribution activity
September 2022
Created 4 commits in 1 repository
Created a pull request in ray-project/ray that received 92 comments
[core] Support generators to allow tasks to return a dynamic number of objects
Why are these changes needed? This adds support for tasks that need to return a dynamic number of objects. When a remote generator function is inv…
+1,323 −284 • 92 comments
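The feature described above landed in Ray as generator tasks with `num_returns="dynamic"`. As a rough plain-Python sketch of the semantics only (not Ray's implementation; `submit_dynamic` and the use of `concurrent.futures` are illustrative stand-ins for Ray's task submission and `ObjectRef`s):

```python
from concurrent.futures import ThreadPoolExecutor

def split_words(text):
    # A generator task: the number of yielded values depends on
    # the input, so it cannot be declared statically up front.
    for word in text.split():
        yield word

def submit_dynamic(pool, gen_fn, *args):
    # Illustrative stand-in for submitting a generator task: each
    # yielded value becomes its own future (an "object ref"), so
    # the caller can consume results without knowing their count
    # in advance.
    return [pool.submit(lambda w=w: w) for w in gen_fn(*args)]

with ThreadPoolExecutor() as pool:
    refs = submit_dynamic(pool, split_words, "a b c d")
    results = [ref.result() for ref in refs]

print(results)  # ['a', 'b', 'c', 'd']
```

In Ray itself, the caller instead receives a stream of `ObjectRef`s from the generator task, so results can be fetched incrementally even though their number is only determined at run time.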
Opened 6 other pull requests in 1 repository
ray-project/ray
2 open · 3 merged · 1 closed
- [doc] Mark ray.put(_owner=actor) as experimental
- [Datasets] Use generators for block splitting
- Revert "[metrics] Force export census metrics on worker death"
- Revert "[Job Submission][CPP Worker] introduce CPP job submission"
- Revert "[train/horovod] Fix horovod long running release test"
- Revert "[tune] Raise error in PGF if head and worker bundles are empty"
Reviewed 15 pull requests in 2 repositories
ray-project/ray
14 pull requests
- [core] propagate oom exception when worker is killed due to oom
- [Doc] Revamp ray core design patterns doc [8/n]: pass large arg by value
- [core] Support generators to allow tasks to return a dynamic number of objects
- Handle starting worker throttling inside worker pool
- [Doc] Revamp ray core design patterns doc [7/n]: redefine task actor
- [Core] Unset RAY_RAYLET_NODE_ID for long running nightly tests
- [core] default memory threshold for release tests
- [Doc] Revamp ray core design patterns doc [6/n]: ray wait limits in-flight tasks
- Revert "Remove pins for some Python dependencies"
- Revert "Revert "[train/horovod] Fix horovod long running release test""
- Remove pins for some Python dependencies
- Revert "[core] turn on memory monitor by default"
- [Doc] Revamp ray core design patterns doc [5/n]: ray get in submission order
- [Doc] Revamp ray core design patterns doc [4/n]: unnecessary ray get
clarng/ray
1 pull request
Created an issue in ray-project/ray that received 14 comments
[core] chaos_many_actors failing due to node crash, GCS unreachable
What happened + What you expected to happen
command_scd_W8xe3jzdaCEE1qykk7UwiMEk.log
Nodes dying:
(raylet, ip=172.31.84.216) [2022-09-15 09:58:43,7…
14 comments
Opened 14 other issues in 1 repository
ray-project/ray
10 open · 4 closed
- [docs][AIR] Session API ref link is broken
- [core] Generator task that returns more values than specified by num_returns should throw error instead
- [core] Object returned by a generator with num_returns="dynamic" should throw an error if reconstruction fails
- [core] Objects created by num_returns="dynamic" temporarily leak if generator errors
- [core] Support num_returns="dynamic" generators for actor tasks
- [core] @ray.method for actor methods is not documented
- [core] autoscaling_shuffle_1tb_1000_partitions failing on memory_monitor.cc assertion failure
- [core] shuffle_1tb_5000_partitions times out with ObjectStoreFullError
- [core] microbenchmark_staging release test is failing from ValueError
- [core] All long-running release tests that use a local Ray cluster are failing
- [core] Give a better error message when ray.remote requires can't serialize a global variable
- [core][tests] chaos_dataset_shuffle_push_based_sort_1tb fails with ray.exceptions.WorkerCrashedError
- [core] Make custom event profiling public
- [CI] linux://python/ray/tests:test_cancel is failing/flaky on master.
18 contributions in private repositories (Sep 16 – Sep 23)